[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
24160 1726853523.19630: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
24160 1726853523.20056: Added group all to inventory
24160 1726853523.20058: Added group ungrouped to inventory
24160 1726853523.20062: Group all now contains ungrouped
24160 1726853523.20065: Examining possible inventory source: /tmp/network-iHm/inventory.yml
24160 1726853523.35397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
24160 1726853523.35460: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
24160 1726853523.35482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
24160 1726853523.35536: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
24160 1726853523.35607: Loaded config def from plugin (inventory/script)
24160 1726853523.35609: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
24160 1726853523.35645: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
24160 1726853523.35729: Loaded config def from plugin (inventory/yaml)
24160 1726853523.35731: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
24160 1726853523.35816: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
24160 1726853523.36207: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
24160 1726853523.36211: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
24160 1726853523.36214: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
24160 1726853523.36220: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
24160 1726853523.36225: Loading data from /tmp/network-iHm/inventory.yml
24160 1726853523.36295: /tmp/network-iHm/inventory.yml was not parsable by auto
24160 1726853523.36360: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
24160 1726853523.36400: Loading data from /tmp/network-iHm/inventory.yml
24160 1726853523.36484: group all already in inventory
24160 1726853523.36491: set inventory_file for managed_node1
24160 1726853523.36495: set inventory_dir for managed_node1
24160 1726853523.36496: Added host managed_node1 to inventory
24160 1726853523.36498: Added host managed_node1 to group all
24160 1726853523.36499: set ansible_host for managed_node1
24160 1726853523.36500: set ansible_ssh_extra_args for managed_node1
24160 1726853523.36503: set inventory_file for managed_node2
24160 1726853523.36506: set inventory_dir for managed_node2
24160 1726853523.36507: Added host managed_node2 to inventory
24160 1726853523.36508: Added host managed_node2 to group all
24160 1726853523.36509: set ansible_host for managed_node2
24160 1726853523.36510: set ansible_ssh_extra_args for managed_node2
24160 1726853523.36512: set inventory_file for managed_node3
24160 1726853523.36515: set inventory_dir for managed_node3
24160 1726853523.36515: Added host managed_node3 to inventory
24160 1726853523.36516: Added host managed_node3 to group all
24160 1726853523.36517: set ansible_host for managed_node3
24160 1726853523.36518: set ansible_ssh_extra_args for managed_node3
24160 1726853523.36520: Reconcile groups and hosts in inventory.
24160 1726853523.36524: Group ungrouped now contains managed_node1
24160 1726853523.36526: Group ungrouped now contains managed_node2
24160 1726853523.36528: Group ungrouped now contains managed_node3
24160 1726853523.36605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
24160 1726853523.36726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
24160 1726853523.36778: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
24160 1726853523.36804: Loaded config def from plugin (vars/host_group_vars)
24160 1726853523.36807: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
24160 1726853523.36813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
24160 1726853523.36821: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
24160 1726853523.36865: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
24160 1726853523.37190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853523.37287: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
24160 1726853523.37321: Loaded config def from plugin (connection/local)
24160 1726853523.37323: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
24160 1726853523.37959: Loaded config def from plugin (connection/paramiko_ssh)
24160 1726853523.37963: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
24160 1726853523.38844: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24160 1726853523.38889: Loaded config def from plugin (connection/psrp)
24160 1726853523.38892: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
24160 1726853523.39627: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24160 1726853523.39669: Loaded config def from plugin (connection/ssh)
24160 1726853523.39674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
24160 1726853523.41560: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24160 1726853523.41602: Loaded config def from plugin (connection/winrm)
24160 1726853523.41605: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
24160 1726853523.41637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
24160 1726853523.41702: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
24160 1726853523.41769: Loaded config def from plugin (shell/cmd)
24160 1726853523.41772: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
24160 1726853523.41799: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
24160 1726853523.41868: Loaded config def from plugin (shell/powershell)
24160 1726853523.41870: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
24160 1726853523.41922: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
24160 1726853523.42105: Loaded config def from plugin (shell/sh)
24160 1726853523.42107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
24160 1726853523.42141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
24160 1726853523.42266: Loaded config def from plugin (become/runas)
24160 1726853523.42268: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
24160 1726853523.42451: Loaded config def from plugin (become/su)
24160 1726853523.42456: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
24160 1726853523.42609: Loaded config def from plugin (become/sudo)
24160 1726853523.42611: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
24160 1726853523.42640: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml
24160 1726853523.43042: in VariableManager get_vars()
24160 1726853523.43065: done with get_vars()
24160 1726853523.43192: trying /usr/local/lib/python3.12/site-packages/ansible/modules
24160 1726853523.46100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
24160 1726853523.46207: in VariableManager get_vars()
24160 1726853523.46212: done with get_vars()
24160 1726853523.46215: variable 'playbook_dir' from source: magic vars
24160 1726853523.46216: variable 'ansible_playbook_python' from source: magic vars
24160 1726853523.46216: variable 'ansible_config_file' from source: magic vars
24160 1726853523.46217: variable 'groups' from source: magic vars
24160 1726853523.46218: variable 'omit' from source: magic vars
24160 1726853523.46218: variable 'ansible_version' from source: magic vars
24160 1726853523.46219: variable 'ansible_check_mode' from source: magic vars
24160 1726853523.46220: variable 'ansible_diff_mode' from source: magic vars
24160 1726853523.46220: variable 'ansible_forks' from source: magic vars
24160 1726853523.46221: variable 'ansible_inventory_sources' from source: magic vars
24160 1726853523.46222: variable 'ansible_skip_tags' from source: magic vars
24160 1726853523.46222: variable 'ansible_limit' from source: magic vars
24160 1726853523.46223: variable 'ansible_run_tags' from source: magic vars
24160 1726853523.46224: variable 'ansible_verbosity' from source: magic vars
24160 1726853523.46259: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml
24160 1726853523.46740: in VariableManager get_vars()
24160 1726853523.46759: done with get_vars()
24160 1726853523.46799: in VariableManager get_vars()
24160 1726853523.46819: done with get_vars()
24160 1726853523.46856: in VariableManager get_vars()
24160 1726853523.46869: done with get_vars()
24160 1726853523.46990: in VariableManager get_vars()
24160 1726853523.47004: done with get_vars()
24160 1726853523.47009: variable 'omit' from source: magic vars
24160 1726853523.47027: variable 'omit' from source: magic vars
24160 1726853523.47063: in VariableManager get_vars()
24160 1726853523.47076: done with get_vars()
24160 1726853523.47120: in VariableManager get_vars()
24160 1726853523.47133: done with get_vars()
24160 1726853523.47169: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24160 1726853523.47382: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24160 1726853523.47506: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24160 1726853523.48127: in VariableManager get_vars()
24160 1726853523.48145: done with get_vars()
24160 1726853523.48566: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
24160 1726853523.48693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24160 1726853523.50676: in VariableManager get_vars()
24160 1726853523.50680: done with get_vars()
24160 1726853523.50682: variable 'playbook_dir' from source: magic vars
24160 1726853523.50683: variable 'ansible_playbook_python' from source: magic vars
24160 1726853523.50684: variable 'ansible_config_file' from source: magic vars
24160 1726853523.50685: variable 'groups' from source: magic vars
24160 1726853523.50686: variable 'omit' from source: magic vars
24160 1726853523.50686: variable 'ansible_version' from source: magic vars
24160 1726853523.50687: variable 'ansible_check_mode' from source: magic vars
24160 1726853523.50688: variable 'ansible_diff_mode' from source: magic vars
24160 1726853523.50689: variable 'ansible_forks' from source: magic vars
24160 1726853523.50689: variable 'ansible_inventory_sources' from source: magic vars
24160 1726853523.50690: variable 'ansible_skip_tags' from source: magic vars
24160 1726853523.50691: variable 'ansible_limit' from source: magic vars
24160 1726853523.50691: variable 'ansible_run_tags' from source: magic vars
24160 1726853523.50692: variable 'ansible_verbosity' from source: magic vars
24160 1726853523.50724: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
24160 1726853523.50811: in VariableManager get_vars()
24160 1726853523.50822: done with get_vars()
24160 1726853523.50862: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24160 1726853523.50968: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24160 1726853523.51041: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24160 1726853523.51498: in VariableManager get_vars()
24160 1726853523.51514: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24160 1726853523.53034: in VariableManager get_vars()
24160 1726853523.53038: done with get_vars()
24160 1726853523.53040: variable 'playbook_dir' from source: magic vars
24160 1726853523.53041: variable 'ansible_playbook_python' from source: magic vars
24160 1726853523.53041: variable 'ansible_config_file' from source: magic vars
24160 1726853523.53042: variable 'groups' from source: magic vars
24160 1726853523.53043: variable 'omit' from source: magic vars
24160 1726853523.53043: variable 'ansible_version' from source: magic vars
24160 1726853523.53044: variable 'ansible_check_mode' from source: magic vars
24160 1726853523.53045: variable 'ansible_diff_mode' from source: magic vars
24160 1726853523.53046: variable 'ansible_forks' from source: magic vars
24160 1726853523.53046: variable 'ansible_inventory_sources' from source: magic vars
24160 1726853523.53047: variable 'ansible_skip_tags' from source: magic vars
24160 1726853523.53048: variable 'ansible_limit' from source: magic vars
24160 1726853523.53048: variable 'ansible_run_tags' from source: magic vars
24160 1726853523.53049: variable 'ansible_verbosity' from source: magic vars
24160 1726853523.53088: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
24160 1726853523.53148: in VariableManager get_vars()
24160 1726853523.53161: done with get_vars()
24160 1726853523.53200: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24160 1726853523.53310: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24160 1726853523.55084: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24160 1726853523.55466: in VariableManager get_vars()
24160 1726853523.55487: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24160 1726853523.57282: in VariableManager get_vars()
24160 1726853523.57295: done with get_vars()
24160 1726853523.57331: in VariableManager get_vars()
24160 1726853523.57361: done with get_vars()
24160 1726853523.57402: in VariableManager get_vars()
24160 1726853523.57427: done with get_vars()
24160 1726853523.57466: in VariableManager get_vars()
24160 1726853523.57480: done with get_vars()
24160 1726853523.57538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
24160 1726853523.57551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
24160 1726853523.57780: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
24160 1726853523.57945: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
24160 1726853523.57948: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
24160 1726853523.57984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
24160 1726853523.58008: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
24160 1726853523.58177: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
24160 1726853523.58237: Loaded config def from plugin (callback/default)
24160 1726853523.58240: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24160 1726853523.59474: Loaded config def from plugin (callback/junit)
24160 1726853523.59476: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24160 1726853523.59518: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
24160 1726853523.59586: Loaded config def from plugin (callback/minimal)
24160 1726853523.59589: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24160 1726853523.59626: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24160 1726853523.59688: Loaded config def from plugin (callback/tree)
24160 1726853523.59690: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
24160 1726853523.59816: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
24160 1726853523.59819: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_disabled_nm.yml *******************************************
5 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml
24160 1726853523.59844: in VariableManager get_vars()
24160 1726853523.59857: done with get_vars()
24160 1726853523.59862: in VariableManager get_vars()
24160 1726853523.59869: done with get_vars()
24160 1726853523.59874: variable 'omit' from source: magic vars
24160 1726853523.59904: in VariableManager get_vars()
24160 1726853523.59916: done with get_vars()
24160 1726853523.59932: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6_disabled.yml' with nm as provider] ****
24160 1726853523.60492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
24160 1726853523.60566: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
24160 1726853523.60598: getting the remaining hosts for this loop
24160 1726853523.60600: done getting the remaining hosts for this loop
24160 1726853523.60602: getting the next task for host managed_node1
24160 1726853523.60606: done getting next task for host managed_node1
24160 1726853523.60608: ^ task is: TASK: Gathering Facts
24160 1726853523.60609: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853523.60612: getting variables
24160 1726853523.60613: in VariableManager get_vars()
24160 1726853523.60623: Calling all_inventory to load vars for managed_node1
24160 1726853523.60625: Calling groups_inventory to load vars for managed_node1
24160 1726853523.60628: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853523.60641: Calling all_plugins_play to load vars for managed_node1
24160 1726853523.60652: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853523.60696: Calling groups_plugins_play to load vars for managed_node1
24160 1726853523.60730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853523.60786: done with get_vars()
24160 1726853523.60793: done getting variables
24160 1726853523.60862: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
Friday 20 September 2024  13:32:03 -0400 (0:00:00.013)       0:00:00.013 ******
24160 1726853523.61087: entering _queue_task() for managed_node1/gather_facts
24160 1726853523.61088: Creating lock for gather_facts
24160 1726853523.61646: worker is 1 (out of 1 available)
24160 1726853523.61658: exiting _queue_task() for managed_node1/gather_facts
24160 1726853523.61875: done queuing things up, now waiting for results queue to drain
24160 1726853523.61878: waiting for pending results...
24160 1726853523.62229: running TaskExecutor() for managed_node1/TASK: Gathering Facts
24160 1726853523.62238: in run() - task 02083763-bbaf-5676-4eb4-0000000000a3
24160 1726853523.62322: variable 'ansible_search_path' from source: unknown
24160 1726853523.62777: calling self._execute()
24160 1726853523.62780: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853523.62783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853523.62785: variable 'omit' from source: magic vars
24160 1726853523.62787: variable 'omit' from source: magic vars
24160 1726853523.62812: variable 'omit' from source: magic vars
24160 1726853523.62853: variable 'omit' from source: magic vars
24160 1726853523.62954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24160 1726853523.63053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24160 1726853523.63077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24160 1726853523.63142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24160 1726853523.63157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24160 1726853523.63203: variable 'inventory_hostname' from source: host vars for 'managed_node1'
24160 1726853523.63239: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853523.63451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853523.63454: Set connection var ansible_shell_executable to /bin/sh
24160 1726853523.63456: Set connection var ansible_pipelining to False
24160 1726853523.63459: Set connection var ansible_connection to ssh
24160 1726853523.63563: Set connection var ansible_shell_type to sh
24160 1726853523.63577: Set connection var ansible_module_compression to ZIP_DEFLATED
24160 1726853523.63588: Set connection var ansible_timeout to 10
24160 1726853523.63609: variable 'ansible_shell_executable' from source: unknown
24160 1726853523.63616: variable 'ansible_connection' from source: unknown
24160 1726853523.63623: variable 'ansible_module_compression' from source: unknown
24160 1726853523.63628: variable 'ansible_shell_type' from source: unknown
24160 1726853523.63634: variable 'ansible_shell_executable' from source: unknown
24160 1726853523.63640: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853523.63647: variable 'ansible_pipelining' from source: unknown
24160 1726853523.63654: variable 'ansible_timeout' from source: unknown
24160 1726853523.63664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853523.64077: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
24160 1726853523.64081: variable 'omit' from source: magic vars
24160 1726853523.64083: starting attempt loop
24160 1726853523.64086: running the handler
24160 1726853523.64276: variable 'ansible_facts' from source: unknown
24160 1726853523.64279: _low_level_execute_command(): starting
24160 1726853523.64281: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
24160 1726853523.65901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24160 1726853523.65904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24160 1726853523.65953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
24160 1726853523.65966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24160 1726853523.66202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<<
24160 1726853523.67822: stdout chunk (state=3): >>>/root <<<
24160 1726853523.67953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24160 1726853523.67969: stdout chunk (state=3): >>><<<
24160 1726853523.67991: stderr chunk (state=3): >>><<<
24160 1726853523.68297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
24160 1726853523.68301: _low_level_execute_command(): starting
24160 1726853523.68304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691 `" && echo ansible-tmp-1726853523.682007-24192-259702330034691="` echo /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691 `" ) && sleep 0'
24160 1726853523.69200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24160 1726853523.69212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24160 1726853523.69233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24160 1726853523.69256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24160 1726853523.69287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
24160 1726853523.69301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<<
24160 1726853523.69339: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24160 1726853523.69409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
24160 1726853523.69461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
24160 1726853523.69502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24160 1726853523.71407: stdout chunk (state=3): >>>ansible-tmp-1726853523.682007-24192-259702330034691=/root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691 <<<
24160 1726853523.71613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24160 1726853523.71616: stdout chunk (state=3): >>><<<
24160 1726853523.71619: stderr chunk (state=3): >>><<<
24160 1726853523.71621: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853523.682007-24192-259702330034691=/root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
24160 1726853523.71829: variable 'ansible_module_compression' from source: unknown
24160 1726853523.71833: ANSIBALLZ: Using generic lock for ansible.legacy.setup
24160 1726853523.71836: ANSIBALLZ: Acquiring lock
24160 1726853523.71838: ANSIBALLZ: Lock acquired: 140302803944608
24160 1726853523.71840: ANSIBALLZ: Creating module
24160 1726853524.18981: ANSIBALLZ: Writing module into payload
24160 1726853524.19242: ANSIBALLZ: Writing module
24160 1726853524.19263: ANSIBALLZ: Renaming module
24160 1726853524.19269: ANSIBALLZ: Done creating module
24160 1726853524.19347: variable 'ansible_facts' from source: unknown
24160 1726853524.19356: variable 'inventory_hostname' from source: host vars for 'managed_node1'
24160 1726853524.19365: _low_level_execute_command(): starting
24160 1726853524.19375: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
24160 1726853524.20832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
24160 1726853524.20965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853524.21096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853524.21099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853524.22792: stdout chunk (state=3): >>>PLATFORM <<< 24160 1726853524.22893: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 24160 1726853524.23020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853524.23055: stderr chunk (state=3): >>><<< 24160 1726853524.23143: stdout chunk (state=3): >>><<< 24160 1726853524.23237: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853524.23243 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 24160 1726853524.23246: _low_level_execute_command(): starting 24160 1726853524.23248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 24160 1726853524.23613: Sending initial data 24160 1726853524.23616: Sent initial data (1181 bytes) 24160 1726853524.24654: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853524.24668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853524.24682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853524.24834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853524.24854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853524.24987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853524.28403: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 24160 1726853524.28827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853524.28838: stdout chunk (state=3): >>><<< 24160 1726853524.28850: stderr chunk (state=3): >>><<< 24160 1726853524.28876: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853524.28974: variable 'ansible_facts' from source: unknown 24160 1726853524.28987: variable 'ansible_facts' from source: unknown 24160 1726853524.29008: variable 'ansible_module_compression' from source: unknown 24160 1726853524.29057: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24160 1726853524.29092: variable 'ansible_facts' from source: unknown 24160 1726853524.29266: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py 24160 1726853524.29489: Sending initial data 24160 1726853524.29493: Sent initial data (153 bytes) 24160 1726853524.30585: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853524.30702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853524.30709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853524.30728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853524.30817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853524.32393: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24160 1726853524.32401: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 24160 1726853524.32433: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853524.32474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853524.32526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpippsvya2 /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py <<< 24160 1726853524.32533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py" <<< 24160 1726853524.32564: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpippsvya2" to remote "/root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py" <<< 24160 1726853524.34083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853524.34086: stdout chunk (state=3): >>><<< 24160 1726853524.34088: stderr chunk (state=3): >>><<< 24160 1726853524.34090: done transferring module to remote 24160 1726853524.34092: _low_level_execute_command(): starting 24160 1726853524.34094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/ /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py && sleep 0' 24160 1726853524.34932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853524.35093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853524.35265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853524.35324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853524.37039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853524.37177: stderr chunk (state=3): >>><<< 24160 1726853524.37181: stdout chunk (state=3): >>><<< 24160 1726853524.37184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853524.37186: _low_level_execute_command(): starting 24160 1726853524.37188: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/AnsiballZ_setup.py && sleep 0' 24160 1726853524.37727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853524.37737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853524.37752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853524.37765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853524.37779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853524.37864: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853524.37890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853524.37961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853524.40081: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 24160 1726853524.40117: stdout chunk (state=3): >>>import _imp # builtin <<< 24160 1726853524.40148: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 24160 1726853524.40151: stdout chunk (state=3): >>>import '_weakref' # <<< 24160 1726853524.40231: stdout chunk (state=3): >>>import '_io' # <<< 24160 1726853524.40234: stdout chunk (state=3): >>>import 'marshal' # <<< 24160 1726853524.40256: stdout chunk (state=3): >>>import 'posix' # <<< 24160 1726853524.40287: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 24160 1726853524.40315: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 24160 1726853524.40372: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.40388: stdout chunk (state=3): >>>import '_codecs' # <<< 24160 1726853524.40410: stdout chunk (state=3): >>>import 'codecs' # <<< 24160 1726853524.40441: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 24160 1726853524.40465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 24160 1726853524.40481: 
stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65109684d0> <<< 24160 1726853524.40492: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510937b30> <<< 24160 1726853524.40508: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 24160 1726853524.40511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 24160 1726853524.40525: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651096aa50> <<< 24160 1726853524.40538: stdout chunk (state=3): >>>import '_signal' # <<< 24160 1726853524.40564: stdout chunk (state=3): >>>import '_abc' # <<< 24160 1726853524.40572: stdout chunk (state=3): >>>import 'abc' # <<< 24160 1726853524.40586: stdout chunk (state=3): >>>import 'io' # <<< 24160 1726853524.40613: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 24160 1726853524.40698: stdout chunk (state=3): >>>import '_collections_abc' # <<< 24160 1726853524.40725: stdout chunk (state=3): >>>import 'genericpath' # <<< 24160 1726853524.40728: stdout chunk (state=3): >>>import 'posixpath' # <<< 24160 1726853524.40750: stdout chunk (state=3): >>>import 'os' # <<< 24160 1726853524.40776: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 24160 1726853524.40790: stdout chunk (state=3): >>>Processing user site-packages <<< 24160 1726853524.40794: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 24160 1726853524.40811: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 24160 1726853524.40840: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 24160 1726853524.40853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 24160 1726853524.40863: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651071d130> <<< 24160 1726853524.40914: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.40936: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651071dfa0> <<< 24160 1726853524.40957: stdout chunk (state=3): >>>import 'site' # <<< 24160 1726853524.40988: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24160 1726853524.41408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 24160 1726853524.41418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 24160 1726853524.41421: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.41439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 24160 1726853524.41474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 24160 1726853524.41505: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 24160 1726853524.41514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 24160 1726853524.41526: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651075bdd0> <<< 24160 1726853524.41546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 24160 1726853524.41551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 24160 1726853524.41576: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651075bfe0> <<< 24160 1726853524.41599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 24160 1726853524.41619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' 
<<< 24160 1726853524.41648: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24160 1726853524.41689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.41709: stdout chunk (state=3): >>>import 'itertools' # <<< 24160 1726853524.41735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 24160 1726853524.41743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107937a0> <<< 24160 1726853524.41761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 24160 1726853524.41764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510793e30> <<< 24160 1726853524.41788: stdout chunk (state=3): >>>import '_collections' # <<< 24160 1726853524.41824: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510773aa0> <<< 24160 1726853524.41841: stdout chunk (state=3): >>>import '_functools' # <<< 24160 1726853524.41867: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107711c0> <<< 24160 1726853524.41957: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510758f80> <<< 24160 1726853524.41980: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 24160 1726853524.42001: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 24160 1726853524.42006: stdout chunk (state=3): >>>import '_sre' # <<< 24160 1726853524.42032: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24160 1726853524.42054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24160 1726853524.42074: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 24160 1726853524.42114: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107b3710> <<< 24160 1726853524.42119: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107b2330> <<< 24160 1726853524.42152: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 24160 1726853524.42157: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510772090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107b0b90> <<< 24160 1726853524.42209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 24160 1726853524.42225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e8740> <<< 24160 1726853524.42229: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510758200> 
<<< 24160 1726853524.42247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 24160 1726853524.42252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 24160 1726853524.42284: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.42289: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65107e8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e8aa0> <<< 24160 1726853524.42319: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65107e8e90> <<< 24160 1726853524.42339: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510756d20> <<< 24160 1726853524.42364: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 24160 1726853524.42372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.42385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 24160 1726853524.42412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 24160 1726853524.42427: stdout 
chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e9580> <<< 24160 1726853524.42438: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e9250> import 'importlib.machinery' # <<< 24160 1726853524.42465: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 24160 1726853524.42488: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107ea480> <<< 24160 1726853524.42497: stdout chunk (state=3): >>>import 'importlib.util' # <<< 24160 1726853524.42511: stdout chunk (state=3): >>>import 'runpy' # <<< 24160 1726853524.42531: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 24160 1726853524.42561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 24160 1726853524.42588: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 24160 1726853524.42601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510800680> <<< 24160 1726853524.42606: stdout chunk (state=3): >>>import 'errno' # <<< 24160 1726853524.42636: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.42641: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f6510801d60> <<< 24160 1726853524.42665: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 24160 1726853524.42668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 24160 1726853524.42696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 24160 1726853524.42699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 24160 1726853524.42709: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510802c00> <<< 24160 1726853524.42747: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510803260> <<< 24160 1726853524.42753: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510802150> <<< 24160 1726853524.42779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 24160 1726853524.42784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 24160 1726853524.42819: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.42832: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510803ce0> <<< 24160 1726853524.42842: 
stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510803410> <<< 24160 1726853524.42878: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107ea4b0> <<< 24160 1726853524.42898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 24160 1726853524.42919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 24160 1726853524.42940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 24160 1726853524.42962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 24160 1726853524.42991: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65104ffbc0> <<< 24160 1726853524.43014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 24160 1726853524.43043: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.43049: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65105286e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510528440> <<< 24160 1726853524.43075: 
stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.43081: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510528710> <<< 24160 1726853524.43106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 24160 1726853524.43111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 24160 1726853524.43180: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.43307: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510529040> <<< 24160 1726853524.43418: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.43421: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.43441: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510529a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105288f0> <<< 24160 1726853524.43447: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65104fdd60> <<< 24160 1726853524.43469: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches 
/usr/lib64/python3.12/weakref.py <<< 24160 1726853524.43492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 24160 1726853524.43514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 24160 1726853524.43522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 24160 1726853524.43533: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651052ade0> <<< 24160 1726853524.43556: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510529b50> <<< 24160 1726853524.43572: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107eaba0> <<< 24160 1726853524.43598: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24160 1726853524.43652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.43675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 24160 1726853524.43704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 24160 1726853524.43737: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510557140> <<< 24160 1726853524.43783: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 24160 1726853524.43799: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.43818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 24160 1726853524.43841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24160 1726853524.43879: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510577500> <<< 24160 1726853524.43901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 24160 1726853524.43942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 24160 1726853524.43998: stdout chunk (state=3): >>>import 'ntpath' # <<< 24160 1726853524.44022: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 24160 1726853524.44032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105d82c0> <<< 24160 1726853524.44044: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 24160 1726853524.44079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 24160 1726853524.44103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 24160 1726853524.44144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24160 1726853524.44227: stdout chunk (state=3): >>>import 'ipaddress' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f65105daa20> <<< 24160 1726853524.44307: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105d83e0> <<< 24160 1726853524.44341: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105a12e0> <<< 24160 1726853524.44370: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff253d0> <<< 24160 1726853524.44394: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510576300> <<< 24160 1726853524.44399: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651052bd10> <<< 24160 1726853524.44574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 24160 1726853524.44597: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6510576900> <<< 24160 1726853524.44864: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_irwjxchl/ansible_ansible.legacy.setup_payload.zip' <<< 24160 1726853524.44869: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.44987: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.45018: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 24160 1726853524.45023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24160 1726853524.45068: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 24160 1726853524.45146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 24160 1726853524.45177: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff8b0b0> <<< 24160 1726853524.45189: stdout chunk (state=3): >>>import '_typing' # <<< 24160 1726853524.45374: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff69fa0> <<< 24160 1726853524.45379: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff69130> # zipimport: zlib available <<< 24160 1726853524.45412: stdout chunk (state=3): >>>import 'ansible' # <<< 24160 1726853524.45420: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.45444: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.45449: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.45476: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 24160 1726853524.45482: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.46888: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.48036: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff88f80> <<< 24160 1726853524.48069: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 24160 1726853524.48077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.48122: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 24160 1726853524.48126: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 24160 1726853524.48163: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffba960> <<< 24160 1726853524.48200: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffba6f0> <<< 24160 1726853524.48229: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffba000> <<< 24160 1726853524.48249: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 24160 1726853524.48269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 24160 1726853524.48307: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffbaa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff8bd40> <<< 24160 1726853524.48320: 
stdout chunk (state=3): >>>import 'atexit' # <<< 24160 1726853524.48346: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffbb6b0> <<< 24160 1726853524.48388: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffbb8f0> <<< 24160 1726853524.48400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24160 1726853524.48450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 24160 1726853524.48463: stdout chunk (state=3): >>>import '_locale' # <<< 24160 1726853524.48512: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffbbe30> <<< 24160 1726853524.48520: stdout chunk (state=3): >>>import 'pwd' # <<< 24160 1726853524.48543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 24160 1726853524.48565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24160 1726853524.48606: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe25be0> <<< 24160 1726853524.48638: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' 
executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe27800> <<< 24160 1726853524.48667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 24160 1726853524.48682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 24160 1726853524.48723: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe281d0> <<< 24160 1726853524.48736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 24160 1726853524.48775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 24160 1726853524.48783: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe29370> <<< 24160 1726853524.48805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 24160 1726853524.48844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 24160 1726853524.48866: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 24160 1726853524.48924: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe2be30> <<< 24160 1726853524.48967: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffbbef0> <<< 24160 1726853524.48994: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe2a0f0> <<< 24160 1726853524.49011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 24160 1726853524.49041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 24160 1726853524.49069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 24160 1726853524.49088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 24160 1726853524.49102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24160 1726853524.49206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 24160 1726853524.49232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 24160 1726853524.49249: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe33ce0> <<< 24160 1726853524.49255: stdout chunk (state=3): >>>import '_tokenize' # <<< 24160 1726853524.49324: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe327b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe32510> <<< 24160 1726853524.49350: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 24160 1726853524.49356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24160 1726853524.49431: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe32a80> <<< 24160 1726853524.49458: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe2a600> <<< 24160 1726853524.49491: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe77fe0> <<< 24160 1726853524.49520: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe78110> <<< 24160 1726853524.49547: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 24160 1726853524.49562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 24160 1726853524.49585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 24160 1726853524.49624: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe79bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe79970> <<< 24160 1726853524.49648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 24160 1726853524.49679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 24160 1726853524.49731: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe7c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe7a2a0> <<< 24160 1726853524.49760: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 24160 1726853524.49795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.49819: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 24160 1726853524.49828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 24160 1726853524.49843: stdout chunk (state=3): >>>import '_string' # <<< 24160 1726853524.49883: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe7f860> <<< 24160 1726853524.50006: stdout chunk 
(state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe7c230> <<< 24160 1726853524.50075: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 24160 1726853524.50080: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe806b0> <<< 24160 1726853524.50103: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.50108: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe80890> <<< 24160 1726853524.50151: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.50160: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe80a10> <<< 24160 1726853524.50164: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe78320> <<< 24160 1726853524.50192: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' 
<<< 24160 1726853524.50214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 24160 1726853524.50238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 24160 1726853524.50268: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.50293: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.50298: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd0c0b0> <<< 24160 1726853524.50445: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.50459: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd0d3a0> <<< 24160 1726853524.50465: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe82870> <<< 24160 1726853524.50503: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe83c20> <<< 24160 1726853524.50506: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe82510> # zipimport: zlib available <<< 24160 1726853524.50536: stdout 
chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 24160 1726853524.50561: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.50645: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.50743: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.50751: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 24160 1726853524.50762: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.50787: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 24160 1726853524.50807: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.50920: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.51041: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.51587: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.52130: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 24160 1726853524.52145: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 24160 1726853524.52173: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 24160 1726853524.52191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.52241: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.52247: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd11580> <<< 24160 1726853524.52331: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 24160 1726853524.52334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd122a0> <<< 24160 1726853524.52358: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105299d0> <<< 24160 1726853524.52412: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 24160 1726853524.52415: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.52435: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.52459: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 24160 1726853524.52464: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.52615: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.52769: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 24160 1726853524.52778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 24160 1726853524.52795: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd12180> <<< 24160 1726853524.52801: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.53262: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.53706: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.53772: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.53852: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.collections' # <<< 24160 1726853524.53857: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.53901: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.53928: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 24160 1726853524.53945: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54011: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54097: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 24160 1726853524.54103: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54123: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 24160 1726853524.54146: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54186: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54223: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 24160 1726853524.54237: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54465: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 24160 1726853524.54761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 24160 1726853524.54769: stdout chunk (state=3): >>>import '_ast' # <<< 24160 1726853524.54849: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd13440> <<< 24160 1726853524.54852: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.54931: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55008: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 24160 
1726853524.55015: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 24160 1726853524.55020: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 24160 1726853524.55045: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55087: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55131: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 24160 1726853524.55135: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55187: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55225: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55287: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55347: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 24160 1726853524.55396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.55492: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.55495: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd1dfa0> <<< 24160 1726853524.55529: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd196d0> <<< 24160 1726853524.55566: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: 
zlib available <<< 24160 1726853524.55633: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.55732: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853524.55994: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe06ab0> <<< 24160 1726853524.56006: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fefe780> <<< 24160 1726853524.56149: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd1e2a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd1e060> # destroy ansible.module_utils.distro <<< 24160 1726853524.56164: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 24160 1726853524.56220: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 24160 1726853524.56223: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 24160 1726853524.56247: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 24160 1726853524.56265: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56322: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56394: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56407: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56422: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56454: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56498: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56534: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56578: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 24160 1726853524.56582: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56655: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56720: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56747: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56774: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 24160 1726853524.56807: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.56970: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.57141: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.57177: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.57257: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.57261: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 24160 1726853524.57282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 24160 1726853524.57304: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 24160 1726853524.57323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 24160 1726853524.57350: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb20c0> <<< 24160 1726853524.57377: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 24160 1726853524.57390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 24160 1726853524.57439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 24160 1726853524.57461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 24160 1726853524.57482: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f927fb0> <<< 24160 1726853524.57518: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.57530: stdout 
chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f92c4a0> <<< 24160 1726853524.57575: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd9b200> <<< 24160 1726853524.57593: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb2c60> <<< 24160 1726853524.57624: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb0b60> <<< 24160 1726853524.57650: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 24160 1726853524.57718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 24160 1726853524.57756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 24160 1726853524.57802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 24160 1726853524.57805: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.57855: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f650f92f350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92ec00> <<< 24160 1726853524.57868: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f92ede0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92e030> <<< 24160 1726853524.57885: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 24160 1726853524.58017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 24160 1726853524.58031: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92f530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 24160 1726853524.58065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 24160 1726853524.58097: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f992060> <<< 24160 1726853524.58139: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92ff80> <<< 24160 1726853524.58172: stdout chunk (state=3): 
>>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb11c0> <<< 24160 1726853524.58203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 24160 1726853524.58213: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 24160 1726853524.58272: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58330: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 24160 1726853524.58355: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58392: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58464: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 24160 1726853524.58489: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 24160 1726853524.58513: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853524.58546: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 24160 1726853524.58597: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58651: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 24160 1726853524.58697: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58750: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 24160 1726853524.58809: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58869: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.58921: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 
1726853524.58986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 24160 1726853524.58998: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.59482: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.59909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 24160 1726853524.59964: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60045: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60089: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 24160 1726853524.60122: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60165: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 24160 1726853524.60168: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60212: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60275: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 24160 1726853524.60285: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60311: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60347: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 24160 1726853524.60364: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60401: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60418: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 24160 1726853524.60431: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60500: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 24160 1726853524.60611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 24160 1726853524.60632: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9922a0> <<< 24160 1726853524.60635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 24160 1726853524.60658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 24160 1726853524.60805: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f992ea0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 24160 1726853524.60853: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.60924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 24160 1726853524.60928: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.61018: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.61113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 24160 1726853524.61168: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.61244: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 24160 1726853524.61297: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.61340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 24160 1726853524.61384: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 24160 1726853524.61453: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.61511: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f9ce3f0> <<< 24160 1726853524.61703: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9be0c0> <<< 24160 1726853524.61732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 24160 1726853524.61765: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.61824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 24160 1726853524.61852: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.61909: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62002: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62104: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62240: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 24160 1726853524.62253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 24160 1726853524.62298: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 24160 1726853524.62345: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62459: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62466: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 24160 1726853524.62521: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.62525: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f9e1d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9e1cd0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 24160 1726853524.62542: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62577: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 24160 1726853524.62637: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62778: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.62924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 24160 1726853524.62944: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63025: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63125: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63167: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 24160 1726853524.63228: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 
1726853524.63255: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63263: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63396: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 24160 1726853524.63552: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63668: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 24160 1726853524.63796: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63834: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.63862: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.64425: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.64947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 24160 1726853524.64981: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65048: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65165: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 24160 1726853524.65169: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65247: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 24160 1726853524.65360: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65551: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 
24160 1726853524.65663: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65693: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 24160 1726853524.65736: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 24160 1726853524.65784: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65879: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.65972: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66175: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 24160 1726853524.66381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 24160 1726853524.66388: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66424: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66462: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 24160 1726853524.66468: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66491: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66517: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 24160 1726853524.66522: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66592: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 24160 1726853524.66667: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66690: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24160 1726853524.66718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 24160 1726853524.66724: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66781: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 24160 1726853524.66847: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66901: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.66961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 24160 1726853524.66967: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67224: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 24160 1726853524.67487: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67542: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 24160 1726853524.67603: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67644: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 24160 1726853524.67688: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67724: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67754: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 24160 1726853524.67761: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67799: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67862: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.openbsd' # <<< 24160 1726853524.67866: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.67916: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.68066: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 24160 1726853524.68145: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853524.68293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853524.68335: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.68498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853524.68533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 24160 1726853524.68540: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.68718: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.68907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 24160 1726853524.68926: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.68964: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.69020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 24160 1726853524.69154: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 24160 
1726853524.69198: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.69287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 24160 1726853524.69296: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.69370: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.69468: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 24160 1726853524.69545: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853524.70145: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 24160 1726853524.70149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 24160 1726853524.70174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 24160 1726853524.70193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 24160 1726853524.70228: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853524.70232: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f77a4b0> <<< 24160 1726853524.70252: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f7787a0> <<< 24160 1726853524.70297: stdout chunk (state=3): >>>import 'encodings.idna' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f650f773f80> <<< 24160 1726853524.85847: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 24160 1726853524.85863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f7c1190> <<< 24160 1726853524.85914: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 24160 1726853524.85934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f7c1f40> <<< 24160 1726853524.85995: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 24160 1726853524.86012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853524.86062: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 24160 1726853524.86087: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9d05f0> <<< 24160 1726853524.86100: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9d3740> <<< 24160 1726853524.86327: 
stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 24160 1726853525.06694: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.38916015625, "5m": 0.337890625, "15m": 0.18505859375}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", 
"ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": 
"on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "04", "epoch": "1726853524", "epoch_int": "1726853524", "date": "2024-09-20", "time": "13:32:04", "iso8601_micro": "2024-09-20T17:32:04.745498Z", "iso8601": "2024-09-20T17:32:04Z", "iso8601_basic": "20240920T133204745498", "iso8601_basic_short": "20240920T133204", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3290, "used": 
241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 691, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 
261794680832, "block_size": 4096, "block_total": 65519099, "block_available": 63914717, "block_used": 1604382, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24160 1726853525.07323: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 24160 1726853525.07327: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath <<< 24160 1726853525.07522: stdout chunk (state=3): >>># cleanup[2] removing posixpath # cleanup[2] 
removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing 
ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] 
removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc <<< 24160 1726853525.07792: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware <<< 24160 1726853525.07820: stdout chunk (state=3): >>># 
destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # 
cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 24160 1726853525.07995: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 24160 1726853525.08016: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 24160 1726853525.08250: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 24160 1726853525.08268: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 24160 1726853525.08316: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool <<< 24160 1726853525.08352: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue <<< 24160 1726853525.08462: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy 
shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 24160 1726853525.08619: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 24160 1726853525.08727: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] 
wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 24160 1726853525.08743: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc<<< 24160 1726853525.08765: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 24160 1726853525.08890: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24160 1726853525.08944: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 24160 1726853525.08964: stdout chunk (state=3): >>># destroy _collections <<< 24160 1726853525.08993: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 24160 1726853525.09006: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 24160 1726853525.09110: stdout chunk (state=3): >>># 
destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 24160 1726853525.09276: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 24160 1726853525.09295: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 24160 1726853525.09318: stdout chunk (state=3): >>># destroy itertools <<< 24160 1726853525.09332: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 24160 1726853525.09765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853525.09781: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853525.09838: stderr chunk (state=3): >>><<< 24160 1726853525.09847: stdout chunk (state=3): >>><<< 24160 1726853525.10425: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65109684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510937b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651096aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651071d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651071dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651075bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651075bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107937a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510793e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510773aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107711c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510758f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107b3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107b2330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510772090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107b0b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510758200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65107e8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e8aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65107e8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510756d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107e9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107ea480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510800680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510801d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510802c00> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510803260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510802150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510803ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510803410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107ea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65104ffbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f65105286e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510528440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510528710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510529040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6510529a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105288f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65104fdd60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651052ade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510529b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65107eaba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510557140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510577500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105d82c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105daa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105d83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105a12e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6510576300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f651052bd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6510576900> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_irwjxchl/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f650ff8b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff69fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff69130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff88f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffba960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffba6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffba000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffbaa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ff8bd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffbb6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffbb8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650ffbbe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe25be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe27800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f650fe281d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe29370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe2be30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650ffbbef0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe2a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe33ce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe327b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe32510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe32a80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe2a600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe77fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe78110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe79bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe79970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe7c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe7a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe7f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe7c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe806b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe80890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe80a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe78320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd0c0b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd0d3a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe82870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fe83c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe82510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd11580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd122a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f65105299d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd12180> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd13440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650fd1dfa0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd196d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fe06ab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fefe780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd1e2a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd1e060> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f927fb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f92c4a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fd9b200> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb2c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb0b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f92f350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92ec00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f92ede0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92e030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92f530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f992060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f92ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650fdb11c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9922a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f992ea0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f9ce3f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9be0c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f9e1d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9e1cd0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f650f77a4b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f7787a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f773f80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f7c1190> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f7c1f40> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9d05f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f650f9d3740> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.38916015625, "5m": 0.337890625, "15m": 0.18505859375}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "04", "epoch": "1726853524", 
"epoch_int": "1726853524", "date": "2024-09-20", "time": "13:32:04", "iso8601_micro": "2024-09-20T17:32:04.745498Z", "iso8601": "2024-09-20T17:32:04Z", "iso8601_basic": "20240920T133204745498", "iso8601_basic_short": "20240920T133204", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3290, "used": 241}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 691, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794680832, "block_size": 4096, "block_total": 65519099, "block_available": 63914717, "block_used": 1604382, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # 
cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: (identical Python interpreter shutdown trace: the same sys.modules cleanup/wiping/destroy sequence shown above, ending with "clear sys.audit hooks") [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
24160 1726853525.13909: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853525.13935: _low_level_execute_command(): starting 24160 1726853525.13946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853523.682007-24192-259702330034691/ > /dev/null 2>&1 && sleep 0' 24160 1726853525.15280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853525.15513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853525.15582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853525.17428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853525.17597: stderr chunk (state=3): >>><<< 24160 1726853525.17606: stdout chunk (state=3): >>><<< 24160 1726853525.17629: _low_level_execute_command() done: rc=0, stdout=, stderr=(same OpenSSH debug output as the stderr chunks above, ending with "Received exit status from master 0") 24160 1726853525.17644: handler run complete 24160 1726853525.18178: variable 'ansible_facts' from source: unknown 24160 1726853525.18181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 24160 1726853525.18746: variable 'ansible_facts' from source: unknown 24160 1726853525.18941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853525.19576: attempt loop complete, returning result 24160 1726853525.19579: _execute() done 24160 1726853525.19582: dumping result to json 24160 1726853525.19584: done dumping result, returning 24160 1726853525.19586: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-5676-4eb4-0000000000a3] 24160 1726853525.19588: sending task result for task 02083763-bbaf-5676-4eb4-0000000000a3 24160 1726853525.21025: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000a3 24160 1726853525.21028: WORKER PROCESS EXITING ok: [managed_node1] 24160 1726853525.21648: no more pending results, returning what we have 24160 1726853525.21651: results queue empty 24160 1726853525.21652: checking for any_errors_fatal 24160 1726853525.21655: done checking for any_errors_fatal 24160 1726853525.21656: checking for max_fail_percentage 24160 1726853525.21657: done checking for max_fail_percentage 24160 1726853525.21658: checking to see if all hosts have failed and the running result is not ok 24160 1726853525.21659: done checking to see if all hosts have failed 24160 1726853525.21660: getting the remaining hosts for this loop 24160 1726853525.21661: done getting the remaining hosts for this loop 24160 1726853525.21665: getting the next task for host managed_node1 24160 1726853525.21670: done getting next task for host managed_node1 24160 1726853525.21673: ^ task is: TASK: meta (flush_handlers) 24160 1726853525.21675: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853525.21679: getting variables 24160 1726853525.21680: in VariableManager get_vars() 24160 1726853525.21700: Calling all_inventory to load vars for managed_node1 24160 1726853525.21702: Calling groups_inventory to load vars for managed_node1 24160 1726853525.21705: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853525.21714: Calling all_plugins_play to load vars for managed_node1 24160 1726853525.21716: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853525.21719: Calling groups_plugins_play to load vars for managed_node1 24160 1726853525.22107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853525.22504: done with get_vars() 24160 1726853525.22515: done getting variables 24160 1726853525.22785: in VariableManager get_vars() 24160 1726853525.22795: Calling all_inventory to load vars for managed_node1 24160 1726853525.22797: Calling groups_inventory to load vars for managed_node1 24160 1726853525.22799: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853525.22804: Calling all_plugins_play to load vars for managed_node1 24160 1726853525.22806: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853525.22809: Calling groups_plugins_play to load vars for managed_node1 24160 1726853525.22944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853525.23351: done with get_vars() 24160 1726853525.23367: done queuing things up, now waiting for results queue to drain 24160 1726853525.23370: results queue empty 24160 1726853525.23507: checking for any_errors_fatal 24160 1726853525.23511: done checking for any_errors_fatal 24160 1726853525.23511: checking for max_fail_percentage 24160 1726853525.23513: done checking for max_fail_percentage 24160 1726853525.23513: checking to see if all hosts have failed and the running result is not 
ok 24160 1726853525.23514: done checking to see if all hosts have failed 24160 1726853525.23515: getting the remaining hosts for this loop 24160 1726853525.23521: done getting the remaining hosts for this loop 24160 1726853525.23523: getting the next task for host managed_node1 24160 1726853525.23528: done getting next task for host managed_node1 24160 1726853525.23530: ^ task is: TASK: Include the task 'el_repo_setup.yml' 24160 1726853525.23532: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853525.23534: getting variables 24160 1726853525.23535: in VariableManager get_vars() 24160 1726853525.23543: Calling all_inventory to load vars for managed_node1 24160 1726853525.23545: Calling groups_inventory to load vars for managed_node1 24160 1726853525.23547: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853525.23557: Calling all_plugins_play to load vars for managed_node1 24160 1726853525.23560: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853525.23563: Calling groups_plugins_play to load vars for managed_node1 24160 1726853525.24183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853525.24363: done with get_vars() 24160 1726853525.24774: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:11 Friday 20 September 2024 13:32:05 -0400 (0:00:01.637) 0:00:01.651 ****** 24160 1726853525.24860: entering _queue_task() for managed_node1/include_tasks 24160 1726853525.24862: Creating lock for include_tasks 24160 1726853525.25768: 
worker is 1 (out of 1 available) 24160 1726853525.25782: exiting _queue_task() for managed_node1/include_tasks 24160 1726853525.25791: done queuing things up, now waiting for results queue to drain 24160 1726853525.25793: waiting for pending results... 24160 1726853525.26389: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 24160 1726853525.26567: in run() - task 02083763-bbaf-5676-4eb4-000000000006 24160 1726853525.26634: variable 'ansible_search_path' from source: unknown 24160 1726853525.26726: calling self._execute() 24160 1726853525.26859: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853525.26873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853525.27052: variable 'omit' from source: magic vars 24160 1726853525.27162: _execute() done 24160 1726853525.27177: dumping result to json 24160 1726853525.27187: done dumping result, returning 24160 1726853525.27198: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-5676-4eb4-000000000006] 24160 1726853525.27208: sending task result for task 02083763-bbaf-5676-4eb4-000000000006 24160 1726853525.27465: done sending task result for task 02083763-bbaf-5676-4eb4-000000000006 24160 1726853525.27468: WORKER PROCESS EXITING 24160 1726853525.27525: no more pending results, returning what we have 24160 1726853525.27529: in VariableManager get_vars() 24160 1726853525.27561: Calling all_inventory to load vars for managed_node1 24160 1726853525.27564: Calling groups_inventory to load vars for managed_node1 24160 1726853525.27567: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853525.27582: Calling all_plugins_play to load vars for managed_node1 24160 1726853525.27585: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853525.27589: Calling groups_plugins_play to load vars for managed_node1 24160 1726853525.27784: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853525.28190: done with get_vars() 24160 1726853525.28198: variable 'ansible_search_path' from source: unknown 24160 1726853525.28211: we have included files to process 24160 1726853525.28212: generating all_blocks data 24160 1726853525.28213: done generating all_blocks data 24160 1726853525.28214: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24160 1726853525.28215: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24160 1726853525.28217: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24160 1726853525.29780: in VariableManager get_vars() 24160 1726853525.29796: done with get_vars() 24160 1726853525.29809: done processing included file 24160 1726853525.29811: iterating over new_blocks loaded from include file 24160 1726853525.29813: in VariableManager get_vars() 24160 1726853525.29823: done with get_vars() 24160 1726853525.29824: filtering new block on tags 24160 1726853525.29839: done filtering new block on tags 24160 1726853525.29842: in VariableManager get_vars() 24160 1726853525.29852: done with get_vars() 24160 1726853525.29856: filtering new block on tags 24160 1726853525.29873: done filtering new block on tags 24160 1726853525.29875: in VariableManager get_vars() 24160 1726853525.29886: done with get_vars() 24160 1726853525.29887: filtering new block on tags 24160 1726853525.29902: done filtering new block on tags 24160 1726853525.29904: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 24160 1726853525.29910: extending task lists for all hosts with included blocks 24160 
1726853525.29957: done extending task lists 24160 1726853525.29959: done processing included files 24160 1726853525.29959: results queue empty 24160 1726853525.29960: checking for any_errors_fatal 24160 1726853525.29961: done checking for any_errors_fatal 24160 1726853525.29962: checking for max_fail_percentage 24160 1726853525.29963: done checking for max_fail_percentage 24160 1726853525.29964: checking to see if all hosts have failed and the running result is not ok 24160 1726853525.29964: done checking to see if all hosts have failed 24160 1726853525.29965: getting the remaining hosts for this loop 24160 1726853525.29966: done getting the remaining hosts for this loop 24160 1726853525.29968: getting the next task for host managed_node1 24160 1726853525.30177: done getting next task for host managed_node1 24160 1726853525.30180: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 24160 1726853525.30183: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853525.30185: getting variables 24160 1726853525.30186: in VariableManager get_vars() 24160 1726853525.30194: Calling all_inventory to load vars for managed_node1 24160 1726853525.30196: Calling groups_inventory to load vars for managed_node1 24160 1726853525.30198: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853525.30203: Calling all_plugins_play to load vars for managed_node1 24160 1726853525.30205: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853525.30208: Calling groups_plugins_play to load vars for managed_node1 24160 1726853525.30467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853525.31005: done with get_vars() 24160 1726853525.31013: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:32:05 -0400 (0:00:00.062) 0:00:01.713 ****** 24160 1726853525.31083: entering _queue_task() for managed_node1/setup 24160 1726853525.31610: worker is 1 (out of 1 available) 24160 1726853525.31621: exiting _queue_task() for managed_node1/setup 24160 1726853525.31631: done queuing things up, now waiting for results queue to drain 24160 1726853525.31633: waiting for pending results... 
24160 1726853525.32191: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 24160 1726853525.32197: in run() - task 02083763-bbaf-5676-4eb4-0000000000b4 24160 1726853525.32288: variable 'ansible_search_path' from source: unknown 24160 1726853525.32292: variable 'ansible_search_path' from source: unknown 24160 1726853525.32294: calling self._execute() 24160 1726853525.32407: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853525.32422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853525.32437: variable 'omit' from source: magic vars 24160 1726853525.33448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853525.37676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853525.37746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853525.37913: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853525.37964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853525.38004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853525.38163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853525.38312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853525.38343: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853525.38390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853525.38577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853525.38885: variable 'ansible_facts' from source: unknown 24160 1726853525.38957: variable 'network_test_required_facts' from source: task vars 24160 1726853525.38997: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 24160 1726853525.39003: variable 'omit' from source: magic vars 24160 1726853525.39045: variable 'omit' from source: magic vars 24160 1726853525.39285: variable 'omit' from source: magic vars 24160 1726853525.39312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853525.39339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853525.39356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853525.39378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853525.39389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853525.39529: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853525.39533: variable 'ansible_host' from source: host vars for 
'managed_node1' 24160 1726853525.39535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853525.39690: Set connection var ansible_shell_executable to /bin/sh 24160 1726853525.39696: Set connection var ansible_pipelining to False 24160 1726853525.39699: Set connection var ansible_connection to ssh 24160 1726853525.39702: Set connection var ansible_shell_type to sh 24160 1726853525.39722: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853525.39725: Set connection var ansible_timeout to 10 24160 1726853525.39742: variable 'ansible_shell_executable' from source: unknown 24160 1726853525.39745: variable 'ansible_connection' from source: unknown 24160 1726853525.39748: variable 'ansible_module_compression' from source: unknown 24160 1726853525.39751: variable 'ansible_shell_type' from source: unknown 24160 1726853525.39753: variable 'ansible_shell_executable' from source: unknown 24160 1726853525.39755: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853525.39833: variable 'ansible_pipelining' from source: unknown 24160 1726853525.39837: variable 'ansible_timeout' from source: unknown 24160 1726853525.39839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853525.40198: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853525.40211: variable 'omit' from source: magic vars 24160 1726853525.40216: starting attempt loop 24160 1726853525.40219: running the handler 24160 1726853525.40235: _low_level_execute_command(): starting 24160 1726853525.40241: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853525.41734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853525.41738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853525.41742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853525.41744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853525.41998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853525.42022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853525.43707: stdout chunk (state=3): >>>/root <<< 24160 1726853525.43803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853525.43844: stderr chunk (state=3): >>><<< 24160 1726853525.43853: stdout chunk (state=3): >>><<< 24160 1726853525.43886: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853525.43917: _low_level_execute_command(): starting 24160 1726853525.43928: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236 `" && echo ansible-tmp-1726853525.4390435-24291-273034380687236="` echo /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236 `" ) && sleep 0' 24160 1726853525.45042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853525.45045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853525.45048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853525.45050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853525.45052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853525.45057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853525.45291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853525.47183: stdout chunk (state=3): >>>ansible-tmp-1726853525.4390435-24291-273034380687236=/root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236 <<< 24160 1726853525.47287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853525.47319: stderr chunk (state=3): >>><<< 24160 1726853525.47676: stdout chunk (state=3): >>><<< 24160 1726853525.47680: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853525.4390435-24291-273034380687236=/root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853525.47683: variable 'ansible_module_compression' from source: unknown 24160 1726853525.47685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24160 1726853525.47687: variable 'ansible_facts' from source: unknown 24160 1726853525.48075: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py 24160 1726853525.48489: Sending initial data 24160 1726853525.48498: Sent initial data (154 bytes) 24160 1726853525.49986: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853525.50099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853525.50122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853525.50190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853525.51751: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24160 1726853525.51885: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853525.51909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853525.51952: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp4_19i8d0 /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py <<< 24160 1726853525.51968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py" <<< 24160 1726853525.51998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp4_19i8d0" to remote "/root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py" <<< 24160 1726853525.54658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853525.54678: stderr chunk (state=3): >>><<< 24160 1726853525.54688: stdout chunk (state=3): >>><<< 24160 1726853525.54713: done transferring module to remote 24160 1726853525.54732: _low_level_execute_command(): starting 24160 1726853525.54880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/ /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py && sleep 0' 24160 1726853525.56083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853525.56130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853525.56285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853525.56297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853525.56311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853525.56375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853525.58175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853525.58178: stdout chunk (state=3): >>><<< 24160 1726853525.58181: stderr chunk (state=3): >>><<< 24160 1726853525.58295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853525.58299: _low_level_execute_command(): starting 24160 1726853525.58301: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/AnsiballZ_setup.py && sleep 0' 24160 1726853525.59734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853525.59738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853525.59740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853525.59742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853525.59745: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24160 1726853525.61865: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 24160 1726853525.62089: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 24160 1726853525.62166: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853525.62187: stdout chunk (state=3): >>>import '_codecs' # <<< 24160 1726853525.62207: stdout chunk (state=3): >>>import 'codecs' # <<< 24160 1726853525.62231: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 24160 1726853525.62273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a5104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a4dfb30> <<< 24160 1726853525.62314: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a512a50> <<< 24160 1726853525.62338: stdout chunk (state=3): >>>import '_signal' # <<< 24160 1726853525.62376: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 24160 1726853525.62395: stdout chunk (state=3): >>>import 'io' # <<< 24160 1726853525.62427: 
stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 24160 1726853525.62518: stdout chunk (state=3): >>>import '_collections_abc' # <<< 24160 1726853525.62547: stdout chunk (state=3): >>>import 'genericpath' # <<< 24160 1726853525.62580: stdout chunk (state=3): >>>import 'posixpath' # <<< 24160 1726853525.62639: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 24160 1726853525.62650: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 24160 1726853525.62710: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 24160 1726853525.62719: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a2e5130> <<< 24160 1726853525.62820: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853525.62823: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a2e5fa0> <<< 24160 1726853525.62840: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24160 1726853525.63453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 24160 1726853525.63479: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a323e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 24160 1726853525.63482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 24160 1726853525.63485: stdout chunk (state=3): >>>import '_operator' # <<< 24160 1726853525.63487: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a323f50> <<< 24160 1726853525.63702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 24160 1726853525.63705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 24160 1726853525.63734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a35b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a35bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a33bb60> import '_functools' # <<< 24160 1726853525.63746: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a339280> <<< 24160 1726853525.63834: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a321040> <<< 24160 1726853525.63869: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 24160 1726853525.63891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 24160 1726853525.63919: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 24160 1726853525.63964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24160 1726853525.63967: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 24160 1726853525.64065: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a37b7d0> <<< 
24160 1726853525.64069: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a37a3f0> <<< 24160 1726853525.64073: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a33a150> <<< 24160 1726853525.64075: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a378c20> <<< 24160 1726853525.64145: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b0860> <<< 24160 1726853525.64166: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3202c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 24160 1726853525.64454: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3b0d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b0bc0> <<< 24160 1726853525.64458: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3b0f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a31ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b1610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b12e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 24160 1726853525.64585: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b2510> import 'importlib.util' # <<< 24160 1726853525.64588: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 24160 1726853525.64642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3c8710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3c9df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 24160 1726853525.64646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 24160 1726853525.64779: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3cac90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3ca1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 24160 1726853525.64783: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3cbd70> <<< 24160 1726853525.64795: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3cb4a0> <<< 24160 1726853525.64834: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b2540> <<< 24160 1726853525.64851: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 24160 1726853525.64912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 24160 1726853525.65073: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 24160 1726853525.65078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 24160 1726853525.65081: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0bfbf0> <<< 24160 1726853525.65213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e86e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0e8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # 
code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.65270: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e9040> <<< 24160 1726853525.65499: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e99a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0e88f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0bdd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0eadb0> <<< 24160 1726853525.65542: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0e9af0> <<< 24160 1726853525.65546: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b2c30> <<< 24160 1726853525.65565: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24160 1726853525.65696: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a117110> <<< 24160 1726853525.66005: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a1374a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a198260> <<< 24160 1726853525.66012: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 24160 1726853525.66036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 24160 1726853525.66063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 24160 1726853525.66101: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24160 1726853525.66320: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a19a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a198380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a161280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09fa5340> <<< 24160 1726853525.66347: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a1362a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0ebce0> <<< 24160 1726853525.66593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fde09fa55b0> <<< 24160 1726853525.66884: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_58ez_1es/ansible_setup_payload.zip' # zipimport: zlib available <<< 24160 1726853525.66933: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.66965: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 24160 1726853525.66982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24160 1726853525.67247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a00f0b0> import '_typing' # <<< 24160 1726853525.67315: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09fedfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09fed130> # zipimport: zlib available <<< 24160 1726853525.67357: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 24160 1726853525.67573: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 24160 1726853525.68809: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.69945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a00cf80> <<< 24160 1726853525.70063: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853525.70088: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a03e9c0> <<< 24160 1726853525.70105: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03e750> <<< 24160 1726853525.70140: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03e060> <<< 24160 1726853525.70164: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 24160 1726853525.70205: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03e7e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a00fd40> <<< 24160 1726853525.70225: stdout chunk (state=3): >>>import 'atexit' # <<< 24160 1726853525.70274: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a03f740> <<< 24160 1726853525.70296: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a03f980> <<< 24160 1726853525.70340: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24160 1726853525.70362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 24160 1726853525.70532: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03fec0> <<< 24160 1726853525.70539: stdout chunk (state=3): >>>import 'pwd' # <<< 24160 1726853525.70542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09931d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.70576: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09933920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 24160 1726853525.70579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 24160 1726853525.70632: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099342f0> <<< 24160 1726853525.70635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 24160 1726853525.70672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 24160 1726853525.70693: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09935490> <<< 24160 
1726853525.70708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 24160 1726853525.70737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 24160 1726853525.70762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 24160 1726853525.70818: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09937f50> <<< 24160 1726853525.70892: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e82f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09936210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 24160 1726853525.70919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 24160 1726853525.71076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 24160 1726853525.71132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24160 1726853525.71136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993fe00> <<< 24160 1726853525.71138: stdout chunk (state=3): >>>import '_tokenize' # <<< 24160 1726853525.71222: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993e8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993e630> <<< 24160 1726853525.71225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 24160 1726853525.71238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24160 1726853525.71301: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993eba0> <<< 24160 1726853525.71328: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09936720> <<< 24160 1726853525.71358: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09983f50> <<< 24160 1726853525.71588: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09984290> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09985d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09985ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 24160 1726853525.71784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde099882c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099863f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998ba10> <<< 24160 
1726853525.71875: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099883e0> <<< 24160 1726853525.71938: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998ccb0> <<< 24160 1726853525.71977: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998cbf0> <<< 24160 1726853525.72088: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998cc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09984470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 24160 1726853525.72095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 24160 
1726853525.72142: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.72149: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09818380> <<< 24160 1726853525.72306: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.72310: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde098198b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998eb10> <<< 24160 1726853525.72365: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998fe90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998e720> <<< 24160 1726853525.72388: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.72391: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.72407: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 24160 1726853525.72413: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.72784: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 24160 1726853525.72800: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.72904: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.73444: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.74045: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853525.74081: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.74145: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0981da60> <<< 24160 1726853525.74166: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 24160 1726853525.74183: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0981e8d0> <<< 24160 1726853525.74192: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09819a00> <<< 24160 1726853525.74247: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 24160 1726853525.74250: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 24160 1726853525.74276: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.74301: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 24160 1726853525.74678: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.74786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0981d2b0> # zipimport: zlib available <<< 24160 1726853525.75087: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.75528: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.75599: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.75674: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 24160 1726853525.75680: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.75722: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.75758: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 24160 1726853525.75765: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.76085: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 24160 1726853525.76283: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.76894: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0981f890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.76932: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 24160 1726853525.76936: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.76977: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.77021: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.77079: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.77144: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 24160 1726853525.77204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853525.77268: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0982a090> <<< 24160 1726853525.77477: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09827da0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 24160 
1726853525.77480: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.77506: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.77592: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 24160 1726853525.77609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 24160 1726853525.77674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 24160 1726853525.77725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 24160 1726853525.77760: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099029f0> <<< 24160 1726853525.77803: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099fe6c0> <<< 24160 1726853525.77991: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0982a120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998d850> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 24160 1726853525.78063: 
stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 24160 1726853525.78066: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.78160: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.78221: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.78275: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.78383: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.78439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 24160 1726853525.78473: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.78546: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.78564: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.78606: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 24160 1726853525.78609: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.79083: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853525.79090: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 24160 1726853525.79118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 24160 1726853525.79122: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 24160 1726853525.79151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 24160 1726853525.79163: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098ba570> <<< 24160 1726853525.79195: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 24160 1726853525.79221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 24160 1726853525.79296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 24160 1726853525.79316: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde094f3f80> <<< 24160 1726853525.79541: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09508320> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098a6b10> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098bb110> import 'multiprocessing.context' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde098b8c80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098b8920> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 24160 1726853525.79586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 24160 1726853525.79589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 24160 1726853525.79617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 24160 1726853525.79620: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.79677: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0950b290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0950ab40> <<< 24160 1726853525.79981: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0950ad20> <<< 24160 1726853525.79986: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09509fa0> <<< 24160 1726853525.79989: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 24160 1726853525.80194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0950b440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09555f40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0950bf20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098b8890> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 24160 1726853525.80197: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 24160 1726853525.80200: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80256: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # <<< 24160 1726853525.80263: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80301: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: 
zlib available <<< 24160 1726853525.80332: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 24160 1726853525.80479: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80525: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 24160 1726853525.80581: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80634: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80709: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80805: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.80814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 24160 1726853525.80834: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.81517: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.81993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 24160 1726853525.82002: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.82129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.82147: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 24160 1726853525.82153: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24160 1726853525.82195: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.82221: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 24160 1726853525.82231: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.82391: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.82469: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 24160 1726853525.82477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 24160 1726853525.82496: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde095578f0> <<< 24160 1726853525.82576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 24160 1726853525.82653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 24160 1726853525.82740: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09556cf0> import 'ansible.module_utils.facts.system.local' # <<< 24160 1726853525.82754: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.82814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 24160 1726853525.82825: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.82911: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 24160 1726853525.83085: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 
1726853525.83099: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83147: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 24160 1726853525.83163: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83199: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 24160 1726853525.83366: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853525.83424: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09596270> <<< 24160 1726853525.83619: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09585f70> import 'ansible.module_utils.facts.system.python' # <<< 24160 1726853525.83680: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83699: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 24160 1726853525.83829: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83855: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.83917: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.84028: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.84181: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 24160 
1726853525.84287: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 24160 1726853525.84317: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.84367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 24160 1726853525.84495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde095a9ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde095a9b20> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 24160 1726853525.84518: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.84565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 24160 1726853525.84722: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.84874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 24160 1726853525.84893: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.84984: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85084: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85122: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 
'ansible.module_utils.facts.hardware.darwin' # <<< 24160 1726853525.85258: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85279: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.85376: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 24160 1726853525.85768: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 24160 1726853525.85913: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.85919: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.86400: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.86976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 24160 1726853525.87087: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 24160 1726853525.87153: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87243: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 24160 1726853525.87354: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87505: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 24160 1726853525.87702: stdout chunk (state=3): >>># zipimport: zlib available 
# zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 24160 1726853525.87749: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 24160 1726853525.87900: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.87996: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88195: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 24160 1726853525.88621: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88624: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 24160 1726853525.88638: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 24160 1726853525.88732: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.88752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 24160 1726853525.88774: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88827: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 24160 1726853525.88898: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.88947: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.89003: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.network.hurd' # <<< 24160 1726853525.89029: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.89386: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.89535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 24160 1726853525.89551: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.89676: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.89696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 24160 1726853525.89804: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 24160 1726853525.89955: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 24160 1726853525.90080: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.90084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 24160 1726853525.90109: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 24160 1726853525.90194: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.90231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 24160 1726853525.90234: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.90262: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.90305: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.90628: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.90632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 24160 1726853525.90635: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.90816: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.91010: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 24160 1726853525.91023: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.91065: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.91173: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853525.91212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 24160 1726853525.91228: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.91694: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 24160 1726853525.91697: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853525.92180: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 24160 1726853525.92209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches 
/usr/lib64/python3.12/stringprep.py <<< 24160 1726853525.92227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 24160 1726853525.92266: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde093a75c0> <<< 24160 1726853525.92288: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde093a5700> <<< 24160 1726853525.92323: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde093a5670> <<< 24160 1726853525.93127: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "05", "epoch": "1726853525", "epoch_int": 
"1726853525", "date": "2024-09-20", "time": "13:32:05", "iso8601_micro": "2024-09-20T17:32:05.918027Z", "iso8601": "2024-09-20T17:32:05Z", "iso8601_basic": "20240920T133205918027", "iso8601_basic_short": "20240920T133205", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24160 1726853525.93848: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing 
enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # 
cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal <<< 24160 1726853525.93927: stdout chunk (state=3): >>># cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # 
cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 24160 1726853525.93968: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin <<< 24160 1726853525.94090: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 24160 1726853525.94569: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 24160 1726853525.94604: stdout chunk (state=3): 
>>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 24160 1726853525.94906: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 24160 1726853525.94910: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 24160 1726853525.94953: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 24160 1726853525.94993: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 24160 1726853525.95036: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 24160 1726853525.95298: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping 
posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24160 1726853525.95324: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 24160 1726853525.95356: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 24160 1726853525.95387: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 24160 1726853525.95418: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 24160 1726853525.95526: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 24160 1726853525.95564: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 24160 1726853525.95601: stdout 
chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 24160 1726853525.95630: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 24160 1726853525.96037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853525.96040: stdout chunk (state=3): >>><<< 24160 1726853525.96042: stderr chunk (state=3): >>><<< 24160 1726853525.96491: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a5104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a4dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # 
import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a2e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a2e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a323e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a323f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a35b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0a35bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a33bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a339280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a321040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a37b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a37a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a33a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a378c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b0860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3202c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3b0d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b0bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3b0f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a31ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b1610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b12e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b2510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3c8710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3c9df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3cac90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3ca1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a3cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b2540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0bfbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e86e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0e8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e9040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e99a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0e88f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0bdd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0eadb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0e9af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a3b2c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0a117110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a1374a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a198260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a19a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a198380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a161280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde09fa5340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a1362a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a0ebce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fde09fa55b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_58ez_1es/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a00f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09fedfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09fed130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a00cf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a03e9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03e750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03e060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03e7e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a00fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a03f740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a03f980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0a03fec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09931d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09933920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099342f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09935490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09937f50> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0a0e82f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09936210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993fe00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993e8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993e630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0993eba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09936720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09983f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09984290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09985d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09985ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde099882c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099863f0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998ba10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099883e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998ccb0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998cbf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998cc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09984470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09818380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde098198b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998eb10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0998fe90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998e720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0981da60> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0981e8d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09819a00> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0981d2b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0981f890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0982a090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09827da0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099029f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde099fe6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0982a120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0998d850> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098ba570> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde094f3f80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09508320> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098a6b10> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fde098bb110> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098b8c80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098b8920> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0950b290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0950ab40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0950ad20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09509fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0950b440> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09555f40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0950bf20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde098b8890> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde095578f0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09556cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde09596270> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde09585f70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde095a9ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde095a9b20> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde093a75c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde093a5700> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde093a5670> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "05", "epoch": "1726853525", "epoch_int": "1726853525", 
"date": "2024-09-20", "time": "13:32:05", "iso8601_micro": "2024-09-20T17:32:05.918027Z", "iso8601": "2024-09-20T17:32:05Z", "iso8601_basic": "20240920T133205918027", "iso8601_basic_short": "20240920T133205", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # 
cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # 
destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping 
encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy 
_typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: 24160 1726853525.98997: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable':
'/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853525.99000: _low_level_execute_command(): starting 24160 1726853525.99002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853525.4390435-24291-273034380687236/ > /dev/null 2>&1 && sleep 0' 24160 1726853525.99005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853525.99007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853525.99009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853525.99011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853525.99013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853525.99073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853525.99076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853525.99116: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853526.00947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853526.01127: stderr chunk (state=3): >>><<< 24160 1726853526.01130: stdout chunk (state=3): >>><<< 24160 1726853526.01177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853526.01182: handler run complete 24160 1726853526.01377: variable 'ansible_facts' from source: unknown 24160 1726853526.01380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853526.01592: variable 'ansible_facts' from source: unknown 24160 1726853526.01714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 
1726853526.01821: attempt loop complete, returning result
24160 1726853526.01845: _execute() done
24160 1726853526.01856: dumping result to json
24160 1726853526.01890: done dumping result, returning
24160 1726853526.02077: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-5676-4eb4-0000000000b4]
24160 1726853526.02081: sending task result for task 02083763-bbaf-5676-4eb4-0000000000b4
24160 1726853526.02289: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000b4
24160 1726853526.02293: WORKER PROCESS EXITING
ok: [managed_node1]
24160 1726853526.02404: no more pending results, returning what we have
24160 1726853526.02408: results queue empty
24160 1726853526.02409: checking for any_errors_fatal
24160 1726853526.02410: done checking for any_errors_fatal
24160 1726853526.02411: checking for max_fail_percentage
24160 1726853526.02413: done checking for max_fail_percentage
24160 1726853526.02413: checking to see if all hosts have failed and the running result is not ok
24160 1726853526.02414: done checking to see if all hosts have failed
24160 1726853526.02415: getting the remaining hosts for this loop
24160 1726853526.02416: done getting the remaining hosts for this loop
24160 1726853526.02420: getting the next task for host managed_node1
24160 1726853526.02430: done getting next task for host managed_node1
24160 1726853526.02433: ^ task is: TASK: Check if system is ostree
24160 1726853526.02436: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853526.02440: getting variables
24160 1726853526.02441: in VariableManager get_vars()
24160 1726853526.02874: Calling all_inventory to load vars for managed_node1
24160 1726853526.02878: Calling groups_inventory to load vars for managed_node1
24160 1726853526.02882: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853526.02894: Calling all_plugins_play to load vars for managed_node1
24160 1726853526.02897: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853526.02900: Calling groups_plugins_play to load vars for managed_node1
24160 1726853526.03396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853526.03783: done with get_vars()
24160 1726853526.03794: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024  13:32:06 -0400 (0:00:00.728)       0:00:02.441 ******
24160 1726853526.03885: entering _queue_task() for managed_node1/stat
24160 1726853526.04351: worker is 1 (out of 1 available)
24160 1726853526.04366: exiting _queue_task() for managed_node1/stat
24160 1726853526.04778: done queuing things up, now waiting for results queue to drain
24160 1726853526.04780: waiting for pending results...
24160 1726853526.05019: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 24160 1726853526.05029: in run() - task 02083763-bbaf-5676-4eb4-0000000000b6 24160 1726853526.05176: variable 'ansible_search_path' from source: unknown 24160 1726853526.05179: variable 'ansible_search_path' from source: unknown 24160 1726853526.05182: calling self._execute() 24160 1726853526.05442: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.05446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.05448: variable 'omit' from source: magic vars 24160 1726853526.06322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853526.06790: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853526.06923: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853526.07233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853526.07236: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853526.07298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853526.07327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853526.07479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853526.07561: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853526.07752: Evaluated conditional (not __network_is_ostree is defined): True 24160 1726853526.07767: variable 'omit' from source: magic vars 24160 1726853526.07818: variable 'omit' from source: magic vars 24160 1726853526.08276: variable 'omit' from source: magic vars 24160 1726853526.08280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853526.08284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853526.08295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853526.08537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853526.08541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853526.08543: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853526.08545: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.08548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.08923: Set connection var ansible_shell_executable to /bin/sh 24160 1726853526.09034: Set connection var ansible_pipelining to False 24160 1726853526.09038: Set connection var ansible_connection to ssh 24160 1726853526.09040: Set connection var ansible_shell_type to sh 24160 1726853526.09042: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853526.09044: Set connection var ansible_timeout to 10 24160 1726853526.09046: variable 'ansible_shell_executable' from source: unknown 24160 1726853526.09048: variable 'ansible_connection' from 
source: unknown 24160 1726853526.09050: variable 'ansible_module_compression' from source: unknown 24160 1726853526.09052: variable 'ansible_shell_type' from source: unknown 24160 1726853526.09057: variable 'ansible_shell_executable' from source: unknown 24160 1726853526.09059: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.09061: variable 'ansible_pipelining' from source: unknown 24160 1726853526.09063: variable 'ansible_timeout' from source: unknown 24160 1726853526.09065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.09472: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853526.09592: variable 'omit' from source: magic vars 24160 1726853526.09602: starting attempt loop 24160 1726853526.09609: running the handler 24160 1726853526.09626: _low_level_execute_command(): starting 24160 1726853526.09637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853526.11225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853526.11341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853526.11344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853526.11446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853526.11659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853526.13143: stdout chunk (state=3): >>>/root <<< 24160 1726853526.13457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853526.13460: stdout chunk (state=3): >>><<< 24160 1726853526.13462: stderr chunk (state=3): >>><<< 24160 1726853526.13539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853526.13550: _low_level_execute_command(): starting 24160 1726853526.13556: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264 `" && echo ansible-tmp-1726853526.134861-24309-98372225485264="` echo /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264 `" ) && sleep 0' 24160 1726853526.14820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853526.14835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853526.14849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853526.15133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853526.15136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 
1726853526.17091: stdout chunk (state=3): >>>ansible-tmp-1726853526.134861-24309-98372225485264=/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264 <<< 24160 1726853526.17213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853526.17266: stderr chunk (state=3): >>><<< 24160 1726853526.17270: stdout chunk (state=3): >>><<< 24160 1726853526.17289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853526.134861-24309-98372225485264=/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853526.17417: variable 'ansible_module_compression' from source: unknown 24160 1726853526.17642: ANSIBALLZ: Using lock for stat 24160 1726853526.17651: ANSIBALLZ: Acquiring lock 24160 1726853526.17973: ANSIBALLZ: Lock acquired: 
140302799099360 24160 1726853526.17977: ANSIBALLZ: Creating module 24160 1726853526.46910: ANSIBALLZ: Writing module into payload 24160 1726853526.47019: ANSIBALLZ: Writing module 24160 1726853526.47049: ANSIBALLZ: Renaming module 24160 1726853526.47061: ANSIBALLZ: Done creating module 24160 1726853526.47096: variable 'ansible_facts' from source: unknown 24160 1726853526.47176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py 24160 1726853526.47409: Sending initial data 24160 1726853526.47412: Sent initial data (151 bytes) 24160 1726853526.48065: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853526.48176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853526.48205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853526.48287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853526.50279: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py" <<< 24160 1726853526.50288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpbzujf7bu /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py <<< 24160 1726853526.50291: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpbzujf7bu" to remote "/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py" <<< 24160 1726853526.52056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853526.52259: stderr chunk (state=3): >>><<< 24160 1726853526.52263: stdout chunk (state=3): >>><<< 24160 1726853526.52265: done transferring module to remote 24160 1726853526.52280: _low_level_execute_command(): starting 24160 1726853526.52290: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/ /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py && sleep 0' 24160 1726853526.53609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853526.53633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853526.53724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853526.54217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853526.54231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853526.54326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853526.54646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853526.56252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853526.56263: stdout chunk (state=3): >>><<< 24160 1726853526.56287: stderr chunk (state=3): >>><<< 24160 1726853526.56317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853526.56347: _low_level_execute_command(): starting 24160 1726853526.56357: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/AnsiballZ_stat.py && sleep 0' 24160 1726853526.57540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853526.57581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853526.57660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853526.57729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853526.59912: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 24160 1726853526.59966: stdout chunk (state=3): >>>import _imp # builtin <<< 24160 1726853526.59990: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 24160 1726853526.60046: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 24160 1726853526.60082: stdout chunk (state=3): >>>import 'posix' # <<< 24160 1726853526.60113: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 24160 1726853526.60141: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 24160 1726853526.60202: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853526.60256: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 24160 1726853526.60373: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object 
from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079f104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079edfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079f12a50> <<< 24160 1726853526.60442: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 24160 1726853526.60688: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cc1130> <<< 24160 1726853526.60747: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853526.60796: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cc1fa0> import 'site' # <<< 24160 1726853526.60841: stdout chunk 
(state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 24160 1726853526.61048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 24160 1726853526.61198: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cffe60> <<< 24160 1726853526.61274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 24160 1726853526.61281: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfff20> <<< 24160 1726853526.61599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d37890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d37f20> import '_collections' # <<< 24160 1726853526.61602: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d17b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d15250> <<< 24160 1726853526.61650: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfd010> <<< 24160 1726853526.61704: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 24160 1726853526.61734: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 24160 1726853526.61812: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 24160 1726853526.61839: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d57800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d56450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d16120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d54cb0> <<< 24160 1726853526.61903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfc290> <<< 24160 1726853526.62063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 24160 1726853526.62074: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079d8cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079d8cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 24160 1726853526.62108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8d370> import 'importlib.machinery' # <<< 24160 1726853526.62167: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8e5a0> import 'importlib.util' # <<< 24160 1726853526.62220: stdout chunk (state=3): >>>import 'runpy' # <<< 24160 1726853526.62324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079da5e80> <<< 24160 1726853526.62415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches 
/usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da6d20> <<< 24160 1726853526.62503: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079da7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da6270> <<< 24160 1726853526.62514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079da7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da74d0> <<< 24160 1726853526.62553: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8e510> <<< 24160 1726853526.62776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 24160 1726853526.62806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches 
/usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b33bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 24160 1726853526.62850: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853526.62974: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5cfe0> <<< 24160 1726853526.63084: 
stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853526.63148: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5d910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b31d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 24160 1726853526.63251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 24160 1726853526.63278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5ed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5da60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 24160 1726853526.63318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853526.63356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 24160 1726853526.63386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 24160 1726853526.63479: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff079b83080> <<< 24160 1726853526.63498: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 24160 1726853526.63515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 24160 1726853526.63543: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079bab3e0> <<< 24160 1726853526.63576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 24160 1726853526.63707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 24160 1726853526.63716: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079c0c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 24160 1726853526.63734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 24160 1726853526.63803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 24160 1726853526.63875: stdout chunk (state=3): >>>import 'ipaddress' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff079c0e960> <<< 24160 1726853526.63949: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079c0c320> <<< 24160 1726853526.64131: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079bd91f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079529280> <<< 24160 1726853526.64135: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079baa210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5fc50> <<< 24160 1726853526.64174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 24160 1726853526.64177: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff079baa570> <<< 24160 1726853526.64356: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_6sklozdb/ansible_stat_payload.zip' # zipimport: zlib available <<< 24160 1726853526.64480: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.64603: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 24160 1726853526.64620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 24160 1726853526.64678: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07957ae70> import '_typing' # <<< 24160 1726853526.64841: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079559d90> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079558fb0> <<< 24160 1726853526.64951: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 24160 1726853526.66337: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.67454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079579160> <<< 24160 1726853526.67499: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 24160 1726853526.67581: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853526.67800: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0795aa8d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795aa660> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795a9f70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795aa3c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07957bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0795ab5f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0795ab830> <<< 24160 1726853526.67804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 24160 1726853526.67806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 24160 1726853526.67892: stdout chunk (state=3): >>>import '_locale' # <<< 24160 1726853526.67895: stdout chunk (state=3): >>>import 'locale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff0795abcb0> import 'pwd' # <<< 24160 1726853526.68157: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 24160 1726853526.68161: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07940da60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07940f680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07940ff80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079410e90> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 24160 1726853526.68254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 24160 1726853526.68258: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079413c20> <<< 24160 1726853526.68401: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded 
from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079cffd10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079411ee0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 24160 1726853526.68411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 24160 1726853526.68442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941bb60> import '_tokenize' # <<< 24160 1726853526.68572: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941a630> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941a390> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 24160 1726853526.68679: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941a900> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff0794123f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853526.68683: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079463e00> <<< 24160 1726853526.68790: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079463800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 24160 1726853526.68794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 24160 1726853526.68987: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079465a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079465820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension 
module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079467fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079466120> <<< 24160 1726853526.69044: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 24160 1726853526.69047: stdout chunk (state=3): >>>import '_string' # <<< 24160 1726853526.69117: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07946b710> <<< 24160 1726853526.69228: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794680e0> <<< 24160 1726853526.69325: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 24160 1726853526.69329: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946c9e0> <<< 24160 1726853526.69377: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946c890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946cb00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079464170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 24160 1726853526.69389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 24160 1726853526.69392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 24160 1726853526.69548: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853526.69553: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0794f41a0> <<< 24160 1726853526.69584: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 24160 1726853526.69593: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0794f53a0> <<< 24160 1726853526.69658: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07946e960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946fce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07946e5d0> # zipimport: zlib available <<< 24160 1726853526.69677: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.69792: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 24160 1726853526.69795: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.69944: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 24160 1726853526.70030: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.70165: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.70886: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.71215: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 24160 1726853526.71241: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 24160 1726853526.71321: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0794fd6a0> <<< 24160 1726853526.71419: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794fe3f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794f5820> <<< 24160 1726853526.71458: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 24160 1726853526.71463: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.71499: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.71503: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 24160 1726853526.71618: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.71653: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.71804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 24160 1726853526.71815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794fe270> <<< 24160 1726853526.71877: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.72278: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.73124: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 24160 1726853526.73128: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 24160 1726853526.73187: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 24160 1726853526.73237: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.73240: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.73243: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 24160 1726853526.73580: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.73687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 24160 1726853526.73768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 24160 1726853526.73773: stdout chunk (state=3): >>>import '_ast' # <<< 24160 1726853526.73835: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794ff680> # zipimport: zlib available <<< 24160 1726853526.73965: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.73992: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 24160 1726853526.73996: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 24160 1726853526.73998: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.74109: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 
1726853526.74118: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 24160 1726853526.74186: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.74275: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24160 1726853526.74483: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079309e80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079307bc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 24160 1726853526.74589: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.74638: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.74725: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 24160 1726853526.74728: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 24160 
1726853526.74731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 24160 1726853526.74835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 24160 1726853526.74961: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795fe8a0> <<< 24160 1726853526.74964: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795ee570> <<< 24160 1726853526.75052: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079309fa0> <<< 24160 1726853526.75058: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0793005f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 24160 1726853526.75181: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 24160 1726853526.75285: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.75504: stdout chunk (state=3): >>># zipimport: zlib available <<< 24160 1726853526.75596: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 24160 1726853526.75916: stdout chunk (state=3): >>># 
clear sys.path_importer_cache <<< 24160 1726853526.75941: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 24160 1726853526.75950: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 24160 1726853526.75987: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] 
removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset <<< 24160 1726853526.76023: stdout chunk (state=3): >>># destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy 
ansible <<< 24160 1726853526.76176: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] 
removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 24160 1726853526.76180: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 24160 1726853526.76352: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 24160 1726853526.76360: stdout chunk (state=3): >>># destroy _blake2 <<< 24160 1726853526.76585: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 24160 1726853526.76634: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping 
ctypes._endian # cleanup[3] wiping _ctypes <<< 24160 1726853526.76641: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 24160 1726853526.76679: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 24160 1726853526.76686: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 24160 1726853526.76693: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random <<< 24160 1726853526.76847: stdout chunk (state=3): >>># cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath <<< 24160 1726853526.76850: stdout chunk (state=3): >>># cleanup[3] wiping 
stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 24160 1726853526.76874: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 24160 1726853526.76952: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 24160 1726853526.76960: stdout chunk (state=3): >>># destroy _collections <<< 24160 1726853526.77038: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 24160 1726853526.77044: stdout chunk (state=3): >>># destroy _typing <<< 24160 1726853526.77155: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 24160 1726853526.77182: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy 
encodings.utf_8<<< 24160 1726853526.77205: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings <<< 24160 1726853526.77605: stdout chunk (state=3): >>># destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 24160 1726853526.77624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853526.77676: stderr chunk (state=3): >>><<< 24160 1726853526.77680: stdout chunk (state=3): >>><<< 24160 1726853526.77892: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079f104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079edfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079f12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cc1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cc1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cffe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d37890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff079d37f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d17b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d15250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d57800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d56450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d16120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d54cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079d8cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079d8cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079cfadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8e5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079da5e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da6d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079da7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da6270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079da7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079da74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b33bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079b5d910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b31d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5ed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5da60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079d8e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff079b83080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079bab3e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079c0c200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079c0e960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079c0c320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079bd91f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff079529280> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079baa210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079b5fc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff079baa570> # zipimport: found 30 names in '/tmp/ansible_stat_payload_6sklozdb/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07957ae70> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079559d90> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079558fb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079579160> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0795aa8d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795aa660> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795a9f70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795aa3c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07957bb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0795ab5f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0795ab830> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795abcb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07940da60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07940f680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07940ff80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079410e90> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079413c20> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079cffd10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079411ee0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941bb60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941a630> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941a390> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07941a900> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794123f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079463e00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079463800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079465a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079465820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079467fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079466120> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07946b710> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794680e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946c9e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946c890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946cb00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079464170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0794f41a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0794f53a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07946e960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff07946fce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff07946e5d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0794fd6a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794fe3f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794f5820> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794fe270> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0794ff680> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff079309e80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079307bc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795fe8a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0795ee570> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff079309fa0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0793005f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] 
removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # 
cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping 
sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 24160 1726853526.79779: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853526.79782: _low_level_execute_command(): starting 24160 1726853526.79785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853526.134861-24309-98372225485264/ > /dev/null 2>&1 && sleep 0' 24160 1726853526.79787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853526.79789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853526.80121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853526.80133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853526.80196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 24160 1726853526.82174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853526.82178: stderr chunk (state=3): >>><<< 24160 1726853526.82180: stdout chunk (state=3): >>><<< 24160 1726853526.82212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853526.82216: handler run complete 24160 1726853526.82234: attempt loop complete, returning result 24160 1726853526.82243: _execute() done 24160 1726853526.82245: dumping result to json 24160 1726853526.82247: done dumping result, returning 24160 1726853526.82385: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [02083763-bbaf-5676-4eb4-0000000000b6] 24160 1726853526.82389: sending task result for task 02083763-bbaf-5676-4eb4-0000000000b6 24160 1726853526.82827: done sending task 
result for task 02083763-bbaf-5676-4eb4-0000000000b6 24160 1726853526.82831: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 24160 1726853526.82951: no more pending results, returning what we have 24160 1726853526.82954: results queue empty 24160 1726853526.82958: checking for any_errors_fatal 24160 1726853526.82964: done checking for any_errors_fatal 24160 1726853526.82965: checking for max_fail_percentage 24160 1726853526.82966: done checking for max_fail_percentage 24160 1726853526.82967: checking to see if all hosts have failed and the running result is not ok 24160 1726853526.82967: done checking to see if all hosts have failed 24160 1726853526.82968: getting the remaining hosts for this loop 24160 1726853526.82969: done getting the remaining hosts for this loop 24160 1726853526.82978: getting the next task for host managed_node1 24160 1726853526.82984: done getting next task for host managed_node1 24160 1726853526.82986: ^ task is: TASK: Set flag to indicate system is ostree 24160 1726853526.82989: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853526.82992: getting variables 24160 1726853526.82994: in VariableManager get_vars() 24160 1726853526.83020: Calling all_inventory to load vars for managed_node1 24160 1726853526.83023: Calling groups_inventory to load vars for managed_node1 24160 1726853526.83026: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853526.83036: Calling all_plugins_play to load vars for managed_node1 24160 1726853526.83038: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853526.83041: Calling groups_plugins_play to load vars for managed_node1 24160 1726853526.83764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853526.84094: done with get_vars() 24160 1726853526.84109: done getting variables 24160 1726853526.84229: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:32:06 -0400 (0:00:00.803) 0:00:03.245 ****** 24160 1726853526.84262: entering _queue_task() for managed_node1/set_fact 24160 1726853526.84264: Creating lock for set_fact 24160 1726853526.84631: worker is 1 (out of 1 available) 24160 1726853526.84648: exiting _queue_task() for managed_node1/set_fact 24160 1726853526.84667: done queuing things up, now waiting for results queue to drain 24160 1726853526.84669: waiting for pending results... 
24160 1726853526.85076: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 24160 1726853526.85088: in run() - task 02083763-bbaf-5676-4eb4-0000000000b7 24160 1726853526.85092: variable 'ansible_search_path' from source: unknown 24160 1726853526.85096: variable 'ansible_search_path' from source: unknown 24160 1726853526.85169: calling self._execute() 24160 1726853526.85175: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.85178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.85181: variable 'omit' from source: magic vars 24160 1726853526.86362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853526.86777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853526.86984: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853526.86987: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853526.87045: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853526.87233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853526.87429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853526.87509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853526.87550: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853526.87878: Evaluated conditional (not __network_is_ostree is defined): True 24160 1726853526.87910: variable 'omit' from source: magic vars 24160 1726853526.88019: variable 'omit' from source: magic vars 24160 1726853526.88142: variable '__ostree_booted_stat' from source: set_fact 24160 1726853526.88208: variable 'omit' from source: magic vars 24160 1726853526.88239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853526.88297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853526.88309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853526.88331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853526.88405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853526.88409: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853526.88411: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.88441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.88580: Set connection var ansible_shell_executable to /bin/sh 24160 1726853526.88623: Set connection var ansible_pipelining to False 24160 1726853526.88626: Set connection var ansible_connection to ssh 24160 1726853526.88629: Set connection var ansible_shell_type to sh 24160 1726853526.88632: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853526.88634: Set connection var ansible_timeout to 10 24160 1726853526.88684: variable 'ansible_shell_executable' 
from source: unknown 24160 1726853526.88693: variable 'ansible_connection' from source: unknown 24160 1726853526.88716: variable 'ansible_module_compression' from source: unknown 24160 1726853526.88719: variable 'ansible_shell_type' from source: unknown 24160 1726853526.88859: variable 'ansible_shell_executable' from source: unknown 24160 1726853526.88863: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.88865: variable 'ansible_pipelining' from source: unknown 24160 1726853526.88867: variable 'ansible_timeout' from source: unknown 24160 1726853526.88869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.88927: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853526.88944: variable 'omit' from source: magic vars 24160 1726853526.88961: starting attempt loop 24160 1726853526.88969: running the handler 24160 1726853526.88991: handler run complete 24160 1726853526.89008: attempt loop complete, returning result 24160 1726853526.89015: _execute() done 24160 1726853526.89023: dumping result to json 24160 1726853526.89032: done dumping result, returning 24160 1726853526.89078: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-5676-4eb4-0000000000b7] 24160 1726853526.89081: sending task result for task 02083763-bbaf-5676-4eb4-0000000000b7 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 24160 1726853526.89260: no more pending results, returning what we have 24160 1726853526.89263: results queue empty 24160 1726853526.89264: checking for any_errors_fatal 24160 1726853526.89269: done checking for any_errors_fatal 24160 
1726853526.89270: checking for max_fail_percentage 24160 1726853526.89277: done checking for max_fail_percentage 24160 1726853526.89278: checking to see if all hosts have failed and the running result is not ok 24160 1726853526.89279: done checking to see if all hosts have failed 24160 1726853526.89280: getting the remaining hosts for this loop 24160 1726853526.89281: done getting the remaining hosts for this loop 24160 1726853526.89285: getting the next task for host managed_node1 24160 1726853526.89299: done getting next task for host managed_node1 24160 1726853526.89301: ^ task is: TASK: Fix CentOS6 Base repo 24160 1726853526.89358: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853526.89363: getting variables 24160 1726853526.89364: in VariableManager get_vars() 24160 1726853526.89440: Calling all_inventory to load vars for managed_node1 24160 1726853526.89444: Calling groups_inventory to load vars for managed_node1 24160 1726853526.89446: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853526.89459: Calling all_plugins_play to load vars for managed_node1 24160 1726853526.89461: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853526.89464: Calling groups_plugins_play to load vars for managed_node1 24160 1726853526.89712: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000b7 24160 1726853526.89722: WORKER PROCESS EXITING 24160 1726853526.89741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853526.89875: done with get_vars() 24160 1726853526.89882: done getting variables 24160 1726853526.89979: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:32:06 -0400 (0:00:00.057) 0:00:03.302 ****** 24160 1726853526.89998: entering _queue_task() for managed_node1/copy 24160 1726853526.90220: worker is 1 (out of 1 available) 24160 1726853526.90229: exiting _queue_task() for managed_node1/copy 24160 1726853526.90240: done queuing things up, now waiting for results queue to drain 24160 1726853526.90242: waiting for pending results... 
24160 1726853526.90483: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 24160 1726853526.90580: in run() - task 02083763-bbaf-5676-4eb4-0000000000b9 24160 1726853526.90678: variable 'ansible_search_path' from source: unknown 24160 1726853526.90682: variable 'ansible_search_path' from source: unknown 24160 1726853526.90686: calling self._execute() 24160 1726853526.90733: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.90745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.90759: variable 'omit' from source: magic vars 24160 1726853526.91677: variable 'ansible_distribution' from source: facts 24160 1726853526.91681: Evaluated conditional (ansible_distribution == 'CentOS'): True 24160 1726853526.91969: variable 'ansible_distribution_major_version' from source: facts 24160 1726853526.91987: Evaluated conditional (ansible_distribution_major_version == '6'): False 24160 1726853526.91996: when evaluation is False, skipping this task 24160 1726853526.92005: _execute() done 24160 1726853526.92013: dumping result to json 24160 1726853526.92032: done dumping result, returning 24160 1726853526.92037: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-5676-4eb4-0000000000b9] 24160 1726853526.92105: sending task result for task 02083763-bbaf-5676-4eb4-0000000000b9 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24160 1726853526.92398: no more pending results, returning what we have 24160 1726853526.92401: results queue empty 24160 1726853526.92401: checking for any_errors_fatal 24160 1726853526.92405: done checking for any_errors_fatal 24160 1726853526.92406: checking for max_fail_percentage 24160 1726853526.92407: done checking for max_fail_percentage 24160 1726853526.92408: checking to see if all hosts have failed and the 
running result is not ok 24160 1726853526.92408: done checking to see if all hosts have failed 24160 1726853526.92409: getting the remaining hosts for this loop 24160 1726853526.92410: done getting the remaining hosts for this loop 24160 1726853526.92413: getting the next task for host managed_node1 24160 1726853526.92418: done getting next task for host managed_node1 24160 1726853526.92420: ^ task is: TASK: Include the task 'enable_epel.yml' 24160 1726853526.92423: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853526.92426: getting variables 24160 1726853526.92427: in VariableManager get_vars() 24160 1726853526.92449: Calling all_inventory to load vars for managed_node1 24160 1726853526.92452: Calling groups_inventory to load vars for managed_node1 24160 1726853526.92454: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853526.92462: Calling all_plugins_play to load vars for managed_node1 24160 1726853526.92465: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853526.92467: Calling groups_plugins_play to load vars for managed_node1 24160 1726853526.92589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853526.92740: done with get_vars() 24160 1726853526.92747: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:32:06 -0400 (0:00:00.028) 0:00:03.330 ****** 24160 1726853526.92826: entering _queue_task() for managed_node1/include_tasks 24160 1726853526.92841: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000b9 24160 1726853526.92844: WORKER PROCESS EXITING 24160 1726853526.93012: worker is 1 (out of 1 available) 24160 1726853526.93024: exiting _queue_task() for managed_node1/include_tasks 24160 1726853526.93036: done queuing things up, now waiting for results queue to drain 24160 1726853526.93038: waiting for pending results... 
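[editor's note] The skip traced above follows Ansible's standard `when` short-circuit: `ansible_distribution == 'CentOS'` evaluated True, then `ansible_distribution_major_version == '6'` evaluated False, so the copy action was never sent to the remote host. A minimal sketch of a task shape that would produce exactly this trace — only the name, the module (`copy`, per the "Loading ActionModule 'copy'" line), and the two conditionals come from the log; the dest/content are hypothetical placeholders, not the real contents of el_repo_setup.yml:

```yaml
# Hypothetical reconstruction of el_repo_setup.yml:26.
# Only "name", the copy module, and the two conditionals are confirmed
# by the log; dest and content are illustrative assumptions.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed target path
    content: |
      # vault-archived CentOS 6 repo definition would go here (placeholder)
  when:
    - ansible_distribution == 'CentOS'            # log: evaluated True
    - ansible_distribution_major_version == '6'   # log: evaluated False -> skip
```

When a `when:` list item evaluates False, Ansible reports the failing expression in the result's `false_condition` field, which is exactly what the `skipping: [managed_node1]` JSON above shows.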
24160 1726853526.93213: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 24160 1726853526.93297: in run() - task 02083763-bbaf-5676-4eb4-0000000000ba 24160 1726853526.93329: variable 'ansible_search_path' from source: unknown 24160 1726853526.93332: variable 'ansible_search_path' from source: unknown 24160 1726853526.93345: calling self._execute() 24160 1726853526.93402: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.93427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.93432: variable 'omit' from source: magic vars 24160 1726853526.94076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853526.95846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853526.95913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853526.95952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853526.95992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853526.96025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853526.96105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853526.96141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853526.96183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853526.96237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853526.96258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853526.96379: variable '__network_is_ostree' from source: set_fact 24160 1726853526.96400: Evaluated conditional (not __network_is_ostree | d(false)): True 24160 1726853526.96411: _execute() done 24160 1726853526.96419: dumping result to json 24160 1726853526.96437: done dumping result, returning 24160 1726853526.96449: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-5676-4eb4-0000000000ba] 24160 1726853526.96458: sending task result for task 02083763-bbaf-5676-4eb4-0000000000ba 24160 1726853526.96587: no more pending results, returning what we have 24160 1726853526.96592: in VariableManager get_vars() 24160 1726853526.96623: Calling all_inventory to load vars for managed_node1 24160 1726853526.96625: Calling groups_inventory to load vars for managed_node1 24160 1726853526.96628: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853526.96639: Calling all_plugins_play to load vars for managed_node1 24160 1726853526.96641: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853526.96644: Calling groups_plugins_play to load vars for managed_node1 24160 1726853526.96796: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000ba 24160 1726853526.96799: WORKER PROCESS EXITING 24160 1726853526.96820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 24160 1726853526.96956: done with get_vars() 24160 1726853526.96962: variable 'ansible_search_path' from source: unknown 24160 1726853526.96963: variable 'ansible_search_path' from source: unknown 24160 1726853526.96996: we have included files to process 24160 1726853526.96998: generating all_blocks data 24160 1726853526.96999: done generating all_blocks data 24160 1726853526.97004: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24160 1726853526.97005: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24160 1726853526.97009: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24160 1726853526.97570: done processing included file 24160 1726853526.97574: iterating over new_blocks loaded from include file 24160 1726853526.97574: in VariableManager get_vars() 24160 1726853526.97584: done with get_vars() 24160 1726853526.97585: filtering new block on tags 24160 1726853526.97598: done filtering new block on tags 24160 1726853526.97600: in VariableManager get_vars() 24160 1726853526.97606: done with get_vars() 24160 1726853526.97607: filtering new block on tags 24160 1726853526.97613: done filtering new block on tags 24160 1726853526.97614: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 24160 1726853526.97618: extending task lists for all hosts with included blocks 24160 1726853526.97677: done extending task lists 24160 1726853526.97678: done processing included files 24160 1726853526.97679: results queue empty 24160 1726853526.97679: checking for any_errors_fatal 24160 1726853526.97682: done checking for any_errors_fatal 24160 1726853526.97682: checking for max_fail_percentage 24160 1726853526.97683: done 
checking for max_fail_percentage 24160 1726853526.97683: checking to see if all hosts have failed and the running result is not ok 24160 1726853526.97684: done checking to see if all hosts have failed 24160 1726853526.97684: getting the remaining hosts for this loop 24160 1726853526.97685: done getting the remaining hosts for this loop 24160 1726853526.97686: getting the next task for host managed_node1 24160 1726853526.97690: done getting next task for host managed_node1 24160 1726853526.97691: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 24160 1726853526.97693: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853526.97694: getting variables 24160 1726853526.97695: in VariableManager get_vars() 24160 1726853526.97700: Calling all_inventory to load vars for managed_node1 24160 1726853526.97701: Calling groups_inventory to load vars for managed_node1 24160 1726853526.97703: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853526.97706: Calling all_plugins_play to load vars for managed_node1 24160 1726853526.97711: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853526.97713: Calling groups_plugins_play to load vars for managed_node1 24160 1726853526.97826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853526.97950: done with get_vars() 24160 1726853526.97957: done getting variables 24160 1726853526.98004: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 24160 1726853526.98135: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:32:06 -0400 (0:00:00.053) 0:00:03.384 ****** 24160 1726853526.98167: entering _queue_task() for managed_node1/command 24160 1726853526.98168: Creating lock for command 24160 1726853526.98381: worker is 1 (out of 1 available) 24160 1726853526.98394: exiting _queue_task() for managed_node1/command 24160 1726853526.98405: done queuing things up, now waiting for results queue to drain 24160 1726853526.98407: waiting for pending results... 
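[editor's note] The include step traced above can be sketched as follows. The task name, the included file, and the `not __network_is_ostree | d(false)` conditional all appear verbatim in the log; the exact YAML layout is an assumption:

```yaml
# Hypothetical sketch of el_repo_setup.yml:51, reconstructed from the log.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)  # d() = default(); skip on ostree hosts
```

Because `include_tasks` is dynamic, the included blocks are spliced into the host's task list at runtime rather than at parse time — which is why the log then shows "we have included files to process", "generating all_blocks data", and "extending task lists for all hosts with included blocks".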
24160 1726853526.98581: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 24160 1726853526.98697: in run() - task 02083763-bbaf-5676-4eb4-0000000000d4 24160 1726853526.98722: variable 'ansible_search_path' from source: unknown 24160 1726853526.98732: variable 'ansible_search_path' from source: unknown 24160 1726853526.98778: calling self._execute() 24160 1726853526.98851: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853526.98860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853526.98870: variable 'omit' from source: magic vars 24160 1726853526.99379: variable 'ansible_distribution' from source: facts 24160 1726853526.99383: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24160 1726853526.99433: variable 'ansible_distribution_major_version' from source: facts 24160 1726853526.99439: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24160 1726853526.99442: when evaluation is False, skipping this task 24160 1726853526.99445: _execute() done 24160 1726853526.99455: dumping result to json 24160 1726853526.99461: done dumping result, returning 24160 1726853526.99467: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-5676-4eb4-0000000000d4] 24160 1726853526.99475: sending task result for task 02083763-bbaf-5676-4eb4-0000000000d4 24160 1726853526.99573: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000d4 24160 1726853526.99577: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24160 1726853526.99628: no more pending results, returning what we have 24160 1726853526.99631: results queue empty 24160 1726853526.99632: checking for any_errors_fatal 24160 1726853526.99633: done checking for any_errors_fatal 24160 1726853526.99634: checking for 
max_fail_percentage 24160 1726853526.99636: done checking for max_fail_percentage 24160 1726853526.99637: checking to see if all hosts have failed and the running result is not ok 24160 1726853526.99637: done checking to see if all hosts have failed 24160 1726853526.99638: getting the remaining hosts for this loop 24160 1726853526.99640: done getting the remaining hosts for this loop 24160 1726853526.99643: getting the next task for host managed_node1 24160 1726853526.99649: done getting next task for host managed_node1 24160 1726853526.99651: ^ task is: TASK: Install yum-utils package 24160 1726853526.99658: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853526.99662: getting variables 24160 1726853526.99663: in VariableManager get_vars() 24160 1726853526.99695: Calling all_inventory to load vars for managed_node1 24160 1726853526.99698: Calling groups_inventory to load vars for managed_node1 24160 1726853526.99701: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853526.99713: Calling all_plugins_play to load vars for managed_node1 24160 1726853526.99716: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853526.99719: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.00118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.00422: done with get_vars() 24160 1726853527.00429: done getting variables 24160 1726853527.00542: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:32:07 -0400 (0:00:00.024) 0:00:03.408 ****** 24160 1726853527.00582: entering _queue_task() for managed_node1/package 24160 1726853527.00584: Creating lock for package 24160 1726853527.00812: worker is 1 (out of 1 available) 24160 1726853527.00824: exiting _queue_task() for managed_node1/package 24160 1726853527.00835: done queuing things up, now waiting for results queue to drain 24160 1726853527.00837: waiting for pending results... 
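[editor's note] "Create EPEL 10" is a templated task name: the raw name in enable_epel.yml:8 is `Create EPEL {{ ansible_distribution_major_version }}` (visible at the "^ task is:" line above), and the TASK banner renders it with the host's fact. A hedged sketch of the pattern — only the name, the `command` module, and the two conditionals are from the log; the epel-release URL is a guess at what such a task typically runs, not the file's actual body:

```yaml
# Hypothetical sketch of enable_epel.yml:8. The command body is an
# assumption; the log only confirms a "command" task with these guards.
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: >-
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{
    ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']        # log: True
    - ansible_distribution_major_version in ['7', '8']    # log: False on this host
```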
24160 1726853527.00993: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 24160 1726853527.01077: in run() - task 02083763-bbaf-5676-4eb4-0000000000d5 24160 1726853527.01087: variable 'ansible_search_path' from source: unknown 24160 1726853527.01091: variable 'ansible_search_path' from source: unknown 24160 1726853527.01122: calling self._execute() 24160 1726853527.01175: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.01181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.01189: variable 'omit' from source: magic vars 24160 1726853527.01465: variable 'ansible_distribution' from source: facts 24160 1726853527.01477: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24160 1726853527.01560: variable 'ansible_distribution_major_version' from source: facts 24160 1726853527.01564: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24160 1726853527.01567: when evaluation is False, skipping this task 24160 1726853527.01573: _execute() done 24160 1726853527.01575: dumping result to json 24160 1726853527.01578: done dumping result, returning 24160 1726853527.01585: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-5676-4eb4-0000000000d5] 24160 1726853527.01589: sending task result for task 02083763-bbaf-5676-4eb4-0000000000d5 24160 1726853527.01672: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000d5 24160 1726853527.01676: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24160 1726853527.01745: no more pending results, returning what we have 24160 1726853527.01748: results queue empty 24160 1726853527.01749: checking for any_errors_fatal 24160 1726853527.01752: done checking for any_errors_fatal 24160 
1726853527.01756: checking for max_fail_percentage 24160 1726853527.01757: done checking for max_fail_percentage 24160 1726853527.01758: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.01758: done checking to see if all hosts have failed 24160 1726853527.01759: getting the remaining hosts for this loop 24160 1726853527.01760: done getting the remaining hosts for this loop 24160 1726853527.01763: getting the next task for host managed_node1 24160 1726853527.01767: done getting next task for host managed_node1 24160 1726853527.01769: ^ task is: TASK: Enable EPEL 7 24160 1726853527.01774: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.01777: getting variables 24160 1726853527.01778: in VariableManager get_vars() 24160 1726853527.01800: Calling all_inventory to load vars for managed_node1 24160 1726853527.01803: Calling groups_inventory to load vars for managed_node1 24160 1726853527.01805: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.01812: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.01814: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.01815: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.01917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.02028: done with get_vars() 24160 1726853527.02034: done getting variables 24160 1726853527.02075: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:32:07 -0400 (0:00:00.015) 0:00:03.423 ****** 24160 1726853527.02094: entering _queue_task() for managed_node1/command 24160 1726853527.02259: worker is 1 (out of 1 available) 24160 1726853527.02273: exiting _queue_task() for managed_node1/command 24160 1726853527.02283: done queuing things up, now waiting for results queue to drain 24160 1726853527.02285: waiting for pending results... 
24160 1726853527.02421: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 24160 1726853527.02474: in run() - task 02083763-bbaf-5676-4eb4-0000000000d6 24160 1726853527.02482: variable 'ansible_search_path' from source: unknown 24160 1726853527.02486: variable 'ansible_search_path' from source: unknown 24160 1726853527.02518: calling self._execute() 24160 1726853527.02562: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.02566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.02575: variable 'omit' from source: magic vars 24160 1726853527.02821: variable 'ansible_distribution' from source: facts 24160 1726853527.02832: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24160 1726853527.02916: variable 'ansible_distribution_major_version' from source: facts 24160 1726853527.02919: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24160 1726853527.02922: when evaluation is False, skipping this task 24160 1726853527.02924: _execute() done 24160 1726853527.02929: dumping result to json 24160 1726853527.02932: done dumping result, returning 24160 1726853527.02937: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-5676-4eb4-0000000000d6] 24160 1726853527.02944: sending task result for task 02083763-bbaf-5676-4eb4-0000000000d6 24160 1726853527.03021: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000d6 24160 1726853527.03024: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24160 1726853527.03094: no more pending results, returning what we have 24160 1726853527.03097: results queue empty 24160 1726853527.03098: checking for any_errors_fatal 24160 1726853527.03101: done checking for any_errors_fatal 24160 1726853527.03102: checking for 
max_fail_percentage 24160 1726853527.03103: done checking for max_fail_percentage 24160 1726853527.03103: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.03104: done checking to see if all hosts have failed 24160 1726853527.03105: getting the remaining hosts for this loop 24160 1726853527.03106: done getting the remaining hosts for this loop 24160 1726853527.03109: getting the next task for host managed_node1 24160 1726853527.03113: done getting next task for host managed_node1 24160 1726853527.03115: ^ task is: TASK: Enable EPEL 8 24160 1726853527.03118: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.03120: getting variables 24160 1726853527.03121: in VariableManager get_vars() 24160 1726853527.03138: Calling all_inventory to load vars for managed_node1 24160 1726853527.03139: Calling groups_inventory to load vars for managed_node1 24160 1726853527.03141: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.03147: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.03148: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.03150: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.03302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.03437: done with get_vars() 24160 1726853527.03449: done getting variables 24160 1726853527.03488: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:32:07 -0400 (0:00:00.014) 0:00:03.437 ****** 24160 1726853527.03519: entering _queue_task() for managed_node1/command 24160 1726853527.03690: worker is 1 (out of 1 available) 24160 1726853527.03699: exiting _queue_task() for managed_node1/command 24160 1726853527.03708: done queuing things up, now waiting for results queue to drain 24160 1726853527.03710: waiting for pending results... 
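[editor's note] "Enable EPEL 7" and "Enable EPEL 8" are both `command` tasks guarded by the same `['7', '8']` version list, which also explains why the preceding "Install yum-utils package" task exists: yum-utils ships `yum-config-manager`. The command body below is speculative; the log confirms only the task names, the module, and the conditionals:

```yaml
# Speculative sketch of the enable-EPEL pattern; only names and
# conditionals are confirmed by the log, the command is an assumption.
- name: Enable EPEL 7
  ansible.builtin.command: yum-config-manager --enable epel  # assumed; provided by yum-utils
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```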
24160 1726853527.03999: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 24160 1726853527.04010: in run() - task 02083763-bbaf-5676-4eb4-0000000000d7 24160 1726853527.04023: variable 'ansible_search_path' from source: unknown 24160 1726853527.04026: variable 'ansible_search_path' from source: unknown 24160 1726853527.04056: calling self._execute() 24160 1726853527.04126: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.04130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.04139: variable 'omit' from source: magic vars 24160 1726853527.04442: variable 'ansible_distribution' from source: facts 24160 1726853527.04508: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24160 1726853527.04644: variable 'ansible_distribution_major_version' from source: facts 24160 1726853527.04648: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24160 1726853527.04650: when evaluation is False, skipping this task 24160 1726853527.04652: _execute() done 24160 1726853527.04656: dumping result to json 24160 1726853527.04658: done dumping result, returning 24160 1726853527.04660: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-5676-4eb4-0000000000d7] 24160 1726853527.04662: sending task result for task 02083763-bbaf-5676-4eb4-0000000000d7 24160 1726853527.04719: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000d7 24160 1726853527.04722: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24160 1726853527.04764: no more pending results, returning what we have 24160 1726853527.04767: results queue empty 24160 1726853527.04768: checking for any_errors_fatal 24160 1726853527.04777: done checking for any_errors_fatal 24160 1726853527.04778: checking for 
max_fail_percentage 24160 1726853527.04779: done checking for max_fail_percentage 24160 1726853527.04780: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.04780: done checking to see if all hosts have failed 24160 1726853527.04781: getting the remaining hosts for this loop 24160 1726853527.04782: done getting the remaining hosts for this loop 24160 1726853527.04785: getting the next task for host managed_node1 24160 1726853527.04792: done getting next task for host managed_node1 24160 1726853527.04795: ^ task is: TASK: Enable EPEL 6 24160 1726853527.04798: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.04801: getting variables 24160 1726853527.04802: in VariableManager get_vars() 24160 1726853527.04824: Calling all_inventory to load vars for managed_node1 24160 1726853527.04826: Calling groups_inventory to load vars for managed_node1 24160 1726853527.04829: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.04836: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.04839: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.04842: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.05029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.05241: done with get_vars() 24160 1726853527.05249: done getting variables 24160 1726853527.05317: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:32:07 -0400 (0:00:00.018) 0:00:03.456 ****** 24160 1726853527.05344: entering _queue_task() for managed_node1/copy 24160 1726853527.05560: worker is 1 (out of 1 available) 24160 1726853527.05575: exiting _queue_task() for managed_node1/copy 24160 1726853527.05586: done queuing things up, now waiting for results queue to drain 24160 1726853527.05587: waiting for pending results... 
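The skipped "Enable EPEL 8" task above logs `Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False`. Ansible facts store the major version as a string, so this `when` clause is a string membership test. The sketch below is a simplified stand-in for Ansible's Jinja2 conditional evaluation, not the real implementation, and the fact values are hypothetical (the log does not reveal the actual distribution or version of the host):

```python
# Simplified model of the `when` evaluation traced in the log above.
# Ansible renders each `when` clause through Jinja2 against the host's
# variables; here we mimic the two membership tests directly in Python.

facts = {
    # Hypothetical facts for a CentOS Stream 9 host (assumption).
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "9",  # note: a string, not an int
}

def evaluate(conditional, variables):
    """Evaluate a Python-expression stand-in for a Jinja2 `when` clause."""
    return bool(eval(conditional, {"__builtins__": {}}, variables))

# First clause passes, second fails -> the task is skipped, as in the log.
print(evaluate("ansible_distribution in ['RedHat', 'CentOS']", facts))      # True
print(evaluate("ansible_distribution_major_version in ['7', '8']", facts))  # False
```

Because the fact is a string, `in ['7', '8']` compares strings; writing the list with integers (`in [7, 8]`) would never match any host and is a common playbook mistake.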
24160 1726853527.05715: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 24160 1726853527.05776: in run() - task 02083763-bbaf-5676-4eb4-0000000000d9 24160 1726853527.05786: variable 'ansible_search_path' from source: unknown 24160 1726853527.05790: variable 'ansible_search_path' from source: unknown 24160 1726853527.05814: calling self._execute() 24160 1726853527.05866: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.05872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.05881: variable 'omit' from source: magic vars 24160 1726853527.06117: variable 'ansible_distribution' from source: facts 24160 1726853527.06127: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24160 1726853527.06204: variable 'ansible_distribution_major_version' from source: facts 24160 1726853527.06207: Evaluated conditional (ansible_distribution_major_version == '6'): False 24160 1726853527.06210: when evaluation is False, skipping this task 24160 1726853527.06213: _execute() done 24160 1726853527.06216: dumping result to json 24160 1726853527.06221: done dumping result, returning 24160 1726853527.06226: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-5676-4eb4-0000000000d9] 24160 1726853527.06232: sending task result for task 02083763-bbaf-5676-4eb4-0000000000d9 24160 1726853527.06314: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000d9 24160 1726853527.06317: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24160 1726853527.06385: no more pending results, returning what we have 24160 1726853527.06387: results queue empty 24160 1726853527.06388: checking for any_errors_fatal 24160 1726853527.06391: done checking for any_errors_fatal 24160 1726853527.06391: checking for max_fail_percentage 
24160 1726853527.06393: done checking for max_fail_percentage 24160 1726853527.06393: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.06394: done checking to see if all hosts have failed 24160 1726853527.06395: getting the remaining hosts for this loop 24160 1726853527.06396: done getting the remaining hosts for this loop 24160 1726853527.06399: getting the next task for host managed_node1 24160 1726853527.06403: done getting next task for host managed_node1 24160 1726853527.06405: ^ task is: TASK: Set network provider to 'nm' 24160 1726853527.06406: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853527.06409: getting variables 24160 1726853527.06410: in VariableManager get_vars() 24160 1726853527.06428: Calling all_inventory to load vars for managed_node1 24160 1726853527.06430: Calling groups_inventory to load vars for managed_node1 24160 1726853527.06433: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.06440: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.06441: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.06443: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.06688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.06794: done with get_vars() 24160 1726853527.06800: done getting variables 24160 1726853527.06834: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:13 Friday 20 September 2024 13:32:07 -0400 (0:00:00.015) 0:00:03.471 ****** 24160 1726853527.06849: entering _queue_task() for managed_node1/set_fact 24160 1726853527.07009: worker is 1 (out of 1 available) 24160 1726853527.07020: exiting _queue_task() for managed_node1/set_fact 24160 1726853527.07031: done queuing things up, now waiting for results queue to drain 24160 1726853527.07033: waiting for pending results... 24160 1726853527.07175: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 24160 1726853527.07217: in run() - task 02083763-bbaf-5676-4eb4-000000000007 24160 1726853527.07228: variable 'ansible_search_path' from source: unknown 24160 1726853527.07257: calling self._execute() 24160 1726853527.07307: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.07312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.07319: variable 'omit' from source: magic vars 24160 1726853527.07393: variable 'omit' from source: magic vars 24160 1726853527.07414: variable 'omit' from source: magic vars 24160 1726853527.07438: variable 'omit' from source: magic vars 24160 1726853527.07469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853527.07499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853527.07516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853527.07530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853527.07538: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853527.07561: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853527.07564: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.07567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.07636: Set connection var ansible_shell_executable to /bin/sh 24160 1726853527.07642: Set connection var ansible_pipelining to False 24160 1726853527.07645: Set connection var ansible_connection to ssh 24160 1726853527.07647: Set connection var ansible_shell_type to sh 24160 1726853527.07656: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853527.07662: Set connection var ansible_timeout to 10 24160 1726853527.07678: variable 'ansible_shell_executable' from source: unknown 24160 1726853527.07681: variable 'ansible_connection' from source: unknown 24160 1726853527.07684: variable 'ansible_module_compression' from source: unknown 24160 1726853527.07687: variable 'ansible_shell_type' from source: unknown 24160 1726853527.07689: variable 'ansible_shell_executable' from source: unknown 24160 1726853527.07692: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.07694: variable 'ansible_pipelining' from source: unknown 24160 1726853527.07697: variable 'ansible_timeout' from source: unknown 24160 1726853527.07704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.07799: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853527.07807: variable 'omit' from source: magic vars 24160 1726853527.07816: starting 
attempt loop 24160 1726853527.07818: running the handler 24160 1726853527.07829: handler run complete 24160 1726853527.07833: attempt loop complete, returning result 24160 1726853527.07835: _execute() done 24160 1726853527.07839: dumping result to json 24160 1726853527.07844: done dumping result, returning 24160 1726853527.07849: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [02083763-bbaf-5676-4eb4-000000000007] 24160 1726853527.07856: sending task result for task 02083763-bbaf-5676-4eb4-000000000007 24160 1726853527.07933: done sending task result for task 02083763-bbaf-5676-4eb4-000000000007 24160 1726853527.07936: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 24160 1726853527.07994: no more pending results, returning what we have 24160 1726853527.07996: results queue empty 24160 1726853527.07997: checking for any_errors_fatal 24160 1726853527.08001: done checking for any_errors_fatal 24160 1726853527.08002: checking for max_fail_percentage 24160 1726853527.08003: done checking for max_fail_percentage 24160 1726853527.08004: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.08004: done checking to see if all hosts have failed 24160 1726853527.08005: getting the remaining hosts for this loop 24160 1726853527.08006: done getting the remaining hosts for this loop 24160 1726853527.08014: getting the next task for host managed_node1 24160 1726853527.08019: done getting next task for host managed_node1 24160 1726853527.08021: ^ task is: TASK: meta (flush_handlers) 24160 1726853527.08022: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.08026: getting variables 24160 1726853527.08027: in VariableManager get_vars() 24160 1726853527.08051: Calling all_inventory to load vars for managed_node1 24160 1726853527.08056: Calling groups_inventory to load vars for managed_node1 24160 1726853527.08060: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.08069: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.08073: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.08076: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.08231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.08421: done with get_vars() 24160 1726853527.08430: done getting variables 24160 1726853527.08491: in VariableManager get_vars() 24160 1726853527.08499: Calling all_inventory to load vars for managed_node1 24160 1726853527.08501: Calling groups_inventory to load vars for managed_node1 24160 1726853527.08503: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.08507: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.08510: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.08512: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.08669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.08860: done with get_vars() 24160 1726853527.08875: done queuing things up, now waiting for results queue to drain 24160 1726853527.08877: results queue empty 24160 1726853527.08877: checking for any_errors_fatal 24160 1726853527.08879: done checking for any_errors_fatal 24160 1726853527.08880: checking for max_fail_percentage 24160 1726853527.08881: done checking for max_fail_percentage 24160 1726853527.08881: checking to see if all hosts have failed and the running result is not 
ok 24160 1726853527.08882: done checking to see if all hosts have failed 24160 1726853527.08883: getting the remaining hosts for this loop 24160 1726853527.08883: done getting the remaining hosts for this loop 24160 1726853527.08886: getting the next task for host managed_node1 24160 1726853527.08890: done getting next task for host managed_node1 24160 1726853527.08891: ^ task is: TASK: meta (flush_handlers) 24160 1726853527.08892: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853527.08900: getting variables 24160 1726853527.08901: in VariableManager get_vars() 24160 1726853527.08909: Calling all_inventory to load vars for managed_node1 24160 1726853527.08911: Calling groups_inventory to load vars for managed_node1 24160 1726853527.08913: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.08918: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.08920: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.08922: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.09022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.09129: done with get_vars() 24160 1726853527.09134: done getting variables 24160 1726853527.09163: in VariableManager get_vars() 24160 1726853527.09169: Calling all_inventory to load vars for managed_node1 24160 1726853527.09172: Calling groups_inventory to load vars for managed_node1 24160 1726853527.09174: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.09176: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.09178: Calling groups_plugins_inventory to load vars for 
managed_node1 24160 1726853527.09179: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.09265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.09387: done with get_vars() 24160 1726853527.09394: done queuing things up, now waiting for results queue to drain 24160 1726853527.09396: results queue empty 24160 1726853527.09396: checking for any_errors_fatal 24160 1726853527.09397: done checking for any_errors_fatal 24160 1726853527.09398: checking for max_fail_percentage 24160 1726853527.09399: done checking for max_fail_percentage 24160 1726853527.09399: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.09400: done checking to see if all hosts have failed 24160 1726853527.09400: getting the remaining hosts for this loop 24160 1726853527.09401: done getting the remaining hosts for this loop 24160 1726853527.09402: getting the next task for host managed_node1 24160 1726853527.09404: done getting next task for host managed_node1 24160 1726853527.09405: ^ task is: None 24160 1726853527.09405: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.09406: done queuing things up, now waiting for results queue to drain 24160 1726853527.09407: results queue empty 24160 1726853527.09407: checking for any_errors_fatal 24160 1726853527.09407: done checking for any_errors_fatal 24160 1726853527.09408: checking for max_fail_percentage 24160 1726853527.09408: done checking for max_fail_percentage 24160 1726853527.09409: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.09409: done checking to see if all hosts have failed 24160 1726853527.09410: getting the next task for host managed_node1 24160 1726853527.09412: done getting next task for host managed_node1 24160 1726853527.09412: ^ task is: None 24160 1726853527.09413: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.09447: in VariableManager get_vars() 24160 1726853527.09463: done with get_vars() 24160 1726853527.09467: in VariableManager get_vars() 24160 1726853527.09477: done with get_vars() 24160 1726853527.09480: variable 'omit' from source: magic vars 24160 1726853527.09499: in VariableManager get_vars() 24160 1726853527.09507: done with get_vars() 24160 1726853527.09521: variable 'omit' from source: magic vars PLAY [Play for testing ipv6 disabled] ****************************************** 24160 1726853527.09690: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24160 1726853527.09710: getting the remaining hosts for this loop 24160 1726853527.09711: done getting the remaining hosts for this loop 24160 1726853527.09712: getting the next task for host managed_node1 24160 1726853527.09714: done getting next task for host managed_node1 24160 1726853527.09715: ^ task is: TASK: Gathering Facts 24160 1726853527.09716: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.09717: getting variables 24160 1726853527.09717: in VariableManager get_vars() 24160 1726853527.09725: Calling all_inventory to load vars for managed_node1 24160 1726853527.09727: Calling groups_inventory to load vars for managed_node1 24160 1726853527.09728: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.09731: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.09739: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.09741: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.09822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.09932: done with get_vars() 24160 1726853527.09937: done getting variables 24160 1726853527.09967: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 Friday 20 September 2024 13:32:07 -0400 (0:00:00.031) 0:00:03.502 ****** 24160 1726853527.09984: entering _queue_task() for managed_node1/gather_facts 24160 1726853527.10142: worker is 1 (out of 1 available) 24160 1726853527.10155: exiting _queue_task() for managed_node1/gather_facts 24160 1726853527.10165: done queuing things up, now waiting for results queue to drain 24160 1726853527.10167: waiting for pending results... 
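The "Set network provider to 'nm'" result above never touches the remote host: the `set_fact` action plugin runs entirely on the controller, and the `ansible_facts` dict it returns is merged into the host's variable namespace by the VariableManager. A minimal sketch of that merge, assuming a plain-dict model of host vars (the real VariableManager layers many more variable sources and precedence rules):

```python
# Simplified model of how a set_fact result is folded into host vars.
# In the log, the task returns {"ansible_facts": {"network_provider": "nm"}}
# and the result is layered over the existing facts for managed_node1.

def merge_task_result(host_vars, task_result):
    """Merge the `ansible_facts` key of a task result into host variables."""
    new_facts = task_result.get("ansible_facts", {})
    merged = dict(host_vars)
    merged.update(new_facts)  # newly set facts win over earlier values
    return merged

host_vars = {"ansible_distribution": "CentOS"}  # hypothetical starting facts
result = {"ansible_facts": {"network_provider": "nm"}, "changed": False}

host_vars = merge_task_result(host_vars, result)
print(host_vars["network_provider"])  # nm
```

This is why the task reports `"changed": false` yet still has an effect: setting a fact only mutates controller-side state for later tasks, such as the role dispatch on `network_provider`.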
24160 1726853527.10301: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24160 1726853527.10350: in run() - task 02083763-bbaf-5676-4eb4-0000000000ff 24160 1726853527.10361: variable 'ansible_search_path' from source: unknown 24160 1726853527.10390: calling self._execute() 24160 1726853527.10445: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.10448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.10458: variable 'omit' from source: magic vars 24160 1726853527.10706: variable 'ansible_distribution_major_version' from source: facts 24160 1726853527.10716: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853527.10727: variable 'omit' from source: magic vars 24160 1726853527.10743: variable 'omit' from source: magic vars 24160 1726853527.10766: variable 'omit' from source: magic vars 24160 1726853527.10796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853527.10821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853527.10839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853527.10856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853527.10863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853527.10886: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853527.10889: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.10892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.10959: Set connection var ansible_shell_executable to /bin/sh 24160 1726853527.10962: Set 
connection var ansible_pipelining to False 24160 1726853527.10965: Set connection var ansible_connection to ssh 24160 1726853527.10974: Set connection var ansible_shell_type to sh 24160 1726853527.10981: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853527.10988: Set connection var ansible_timeout to 10 24160 1726853527.11004: variable 'ansible_shell_executable' from source: unknown 24160 1726853527.11007: variable 'ansible_connection' from source: unknown 24160 1726853527.11009: variable 'ansible_module_compression' from source: unknown 24160 1726853527.11012: variable 'ansible_shell_type' from source: unknown 24160 1726853527.11015: variable 'ansible_shell_executable' from source: unknown 24160 1726853527.11017: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.11019: variable 'ansible_pipelining' from source: unknown 24160 1726853527.11023: variable 'ansible_timeout' from source: unknown 24160 1726853527.11026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.11159: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853527.11176: variable 'omit' from source: magic vars 24160 1726853527.11179: starting attempt loop 24160 1726853527.11182: running the handler 24160 1726853527.11191: variable 'ansible_facts' from source: unknown 24160 1726853527.11207: _low_level_execute_command(): starting 24160 1726853527.11214: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853527.11703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 
1726853527.11709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.11711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853527.11714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.11769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853527.11777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853527.11779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853527.11820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853527.13497: stdout chunk (state=3): >>>/root <<< 24160 1726853527.13593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853527.13617: stderr chunk (state=3): >>><<< 24160 1726853527.13621: stdout chunk (state=3): >>><<< 24160 1726853527.13638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853527.13655: _low_level_execute_command(): starting 24160 1726853527.13660: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741 `" && echo ansible-tmp-1726853527.1363833-24362-241641677543741="` echo /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741 `" ) && sleep 0' 24160 1726853527.14112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853527.14115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853527.14118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853527.14127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853527.14129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.14175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853527.14179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853527.14224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853527.16112: stdout chunk (state=3): >>>ansible-tmp-1726853527.1363833-24362-241641677543741=/root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741 <<< 24160 1726853527.16224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853527.16249: stderr chunk (state=3): >>><<< 24160 1726853527.16251: stdout chunk (state=3): >>><<< 24160 1726853527.16263: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853527.1363833-24362-241641677543741=/root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853527.16399: variable 'ansible_module_compression' from source: unknown 24160 1726853527.16402: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24160 1726853527.16404: variable 'ansible_facts' from source: unknown 24160 1726853527.16515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py 24160 1726853527.16607: Sending initial data 24160 1726853527.16611: Sent initial data (154 bytes) 24160 1726853527.17036: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853527.17039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853527.17041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.17043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853527.17045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.17098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853527.17102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853527.17142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853527.18681: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24160 1726853527.18684: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853527.18709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853527.18747: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp99pjee2k /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py <<< 24160 1726853527.18757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py" <<< 24160 1726853527.18793: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp99pjee2k" to remote "/root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py" <<< 24160 1726853527.19769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853527.19805: stderr chunk (state=3): >>><<< 24160 1726853527.19808: stdout chunk (state=3): >>><<< 24160 1726853527.19823: done transferring module to remote 24160 1726853527.19831: _low_level_execute_command(): starting 24160 1726853527.19836: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/ /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py && sleep 0' 24160 1726853527.20256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853527.20259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853527.20261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 
1726853527.20263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853527.20266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.20318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853527.20322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853527.20365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853527.22087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853527.22111: stderr chunk (state=3): >>><<< 24160 1726853527.22114: stdout chunk (state=3): >>><<< 24160 1726853527.22124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853527.22127: _low_level_execute_command(): starting 24160 1726853527.22132: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/AnsiballZ_setup.py && sleep 0' 24160 1726853527.22549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853527.22552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.22557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853527.22559: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853527.22561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853527.22616: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853527.22618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853527.22658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853527.85810: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.38916015625, "5m": 0.337890625, "15m": 0.18505859375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 693, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794680832, "block_size": 4096, "block_total": 65519099, "block_available": 63914717, "block_used": 1604382, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": 
"32", "second": "07", "epoch": "1726853527", "epoch_int": "1726853527", "date": "2024-09-20", "time": "13:32:07", "iso8601_micro": "2024-09-20T17:32:07.853683Z", "iso8601": "2024-09-20T17:32:07Z", "iso8601_basic": "20240920T133207853683", "iso8601_basic_short": "20240920T133207", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24160 1726853527.87745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853527.87792: stderr chunk (state=3): >>><<< 24160 1726853527.87795: stdout chunk (state=3): >>><<< 24160 1726853527.87826: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.38916015625, "5m": 0.337890625, "15m": 0.18505859375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 693, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794680832, "block_size": 4096, "block_total": 65519099, "block_available": 63914717, "block_used": 1604382, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": 
"32", "second": "07", "epoch": "1726853527", "epoch_int": "1726853527", "date": "2024-09-20", "time": "13:32:07", "iso8601_micro": "2024-09-20T17:32:07.853683Z", "iso8601": "2024-09-20T17:32:07Z", "iso8601_basic": "20240920T133207853683", "iso8601_basic_short": "20240920T133207", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853527.88278: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853527.88282: _low_level_execute_command(): starting 24160 1726853527.88284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853527.1363833-24362-241641677543741/ > /dev/null 2>&1 && sleep 0' 24160 1726853527.88876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853527.88893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853527.88909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853527.88929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853527.88946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853527.88962: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853527.89060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853527.89087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853527.89163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853527.91249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853527.91263: stdout chunk (state=3): >>><<< 24160 1726853527.91285: stderr chunk (state=3): >>><<< 24160 1726853527.91308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853527.91362: handler run complete 24160 1726853527.91877: variable 'ansible_facts' from source: unknown 24160 1726853527.92087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.92899: variable 'ansible_facts' from source: unknown 24160 1726853527.93031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.93428: attempt loop complete, returning result 24160 1726853527.93456: _execute() done 24160 1726853527.93459: dumping result to json 24160 1726853527.93493: done dumping result, returning 24160 1726853527.93507: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-5676-4eb4-0000000000ff] 24160 1726853527.93519: sending task result for task 02083763-bbaf-5676-4eb4-0000000000ff ok: [managed_node1] 24160 1726853527.95105: no more pending results, returning what we have 24160 1726853527.95108: results queue empty 24160 1726853527.95109: checking for any_errors_fatal 24160 1726853527.95110: done checking for any_errors_fatal 24160 1726853527.95111: checking for max_fail_percentage 24160 1726853527.95112: done checking for max_fail_percentage 24160 1726853527.95113: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.95121: done checking to see if all hosts have failed 24160 1726853527.95122: getting the remaining hosts for this loop 24160 1726853527.95123: done getting the remaining hosts for this loop 24160 1726853527.95126: getting the next task for host managed_node1 24160 1726853527.95132: done getting next task for host managed_node1 24160 1726853527.95133: ^ task is: TASK: meta (flush_handlers) 24160 1726853527.95136: ^ state is: 
HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853527.95139: getting variables 24160 1726853527.95140: in VariableManager get_vars() 24160 1726853527.95168: Calling all_inventory to load vars for managed_node1 24160 1726853527.95181: done sending task result for task 02083763-bbaf-5676-4eb4-0000000000ff 24160 1726853527.95184: WORKER PROCESS EXITING 24160 1726853527.95186: Calling groups_inventory to load vars for managed_node1 24160 1726853527.95189: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.95199: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.95202: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.95207: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.95753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.96287: done with get_vars() 24160 1726853527.96298: done getting variables 24160 1726853527.96402: in VariableManager get_vars() 24160 1726853527.96414: Calling all_inventory to load vars for managed_node1 24160 1726853527.96416: Calling groups_inventory to load vars for managed_node1 24160 1726853527.96418: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.96538: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.96542: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.96545: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.96809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.97314: done with get_vars() 24160 1726853527.97332: done 
queuing things up, now waiting for results queue to drain 24160 1726853527.97334: results queue empty 24160 1726853527.97335: checking for any_errors_fatal 24160 1726853527.97338: done checking for any_errors_fatal 24160 1726853527.97345: checking for max_fail_percentage 24160 1726853527.97346: done checking for max_fail_percentage 24160 1726853527.97347: checking to see if all hosts have failed and the running result is not ok 24160 1726853527.97347: done checking to see if all hosts have failed 24160 1726853527.97348: getting the remaining hosts for this loop 24160 1726853527.97349: done getting the remaining hosts for this loop 24160 1726853527.97352: getting the next task for host managed_node1 24160 1726853527.97356: done getting next task for host managed_node1 24160 1726853527.97358: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 24160 1726853527.97360: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853527.97365: getting variables 24160 1726853527.97366: in VariableManager get_vars() 24160 1726853527.97380: Calling all_inventory to load vars for managed_node1 24160 1726853527.97382: Calling groups_inventory to load vars for managed_node1 24160 1726853527.97384: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853527.97388: Calling all_plugins_play to load vars for managed_node1 24160 1726853527.97390: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853527.97393: Calling groups_plugins_play to load vars for managed_node1 24160 1726853527.97761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853527.98213: done with get_vars() 24160 1726853527.98221: done getting variables 24160 1726853527.98262: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853527.98633: variable 'type' from source: play vars 24160 1726853527.98640: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:10 Friday 20 September 2024 13:32:07 -0400 (0:00:00.886) 0:00:04.389 ****** 24160 1726853527.98680: entering _queue_task() for managed_node1/set_fact 24160 1726853527.99107: worker is 1 (out of 1 available) 24160 1726853527.99117: exiting _queue_task() for managed_node1/set_fact 24160 1726853527.99127: done queuing things up, now waiting for results queue to drain 24160 1726853527.99128: waiting for pending results... 
24160 1726853527.99296: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=ethtest0 24160 1726853527.99400: in run() - task 02083763-bbaf-5676-4eb4-00000000000b 24160 1726853527.99428: variable 'ansible_search_path' from source: unknown 24160 1726853527.99483: calling self._execute() 24160 1726853527.99676: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853527.99681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853527.99683: variable 'omit' from source: magic vars 24160 1726853527.99998: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.00025: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.00038: variable 'omit' from source: magic vars 24160 1726853528.00065: variable 'omit' from source: magic vars 24160 1726853528.00099: variable 'type' from source: play vars 24160 1726853528.00241: variable 'type' from source: play vars 24160 1726853528.00244: variable 'interface' from source: play vars 24160 1726853528.00263: variable 'interface' from source: play vars 24160 1726853528.00286: variable 'omit' from source: magic vars 24160 1726853528.00328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853528.00384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853528.00409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853528.00432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.00462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.00499: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 
1726853528.00568: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.00574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.00643: Set connection var ansible_shell_executable to /bin/sh 24160 1726853528.00656: Set connection var ansible_pipelining to False 24160 1726853528.00664: Set connection var ansible_connection to ssh 24160 1726853528.00689: Set connection var ansible_shell_type to sh 24160 1726853528.00703: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853528.00725: Set connection var ansible_timeout to 10 24160 1726853528.00753: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.00767: variable 'ansible_connection' from source: unknown 24160 1726853528.00876: variable 'ansible_module_compression' from source: unknown 24160 1726853528.00880: variable 'ansible_shell_type' from source: unknown 24160 1726853528.00882: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.00887: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.00890: variable 'ansible_pipelining' from source: unknown 24160 1726853528.00894: variable 'ansible_timeout' from source: unknown 24160 1726853528.00897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.00993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853528.01026: variable 'omit' from source: magic vars 24160 1726853528.01038: starting attempt loop 24160 1726853528.01120: running the handler 24160 1726853528.01125: handler run complete 24160 1726853528.01128: attempt loop complete, returning result 24160 1726853528.01130: _execute() done 24160 
1726853528.01132: dumping result to json 24160 1726853528.01134: done dumping result, returning 24160 1726853528.01136: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=ethtest0 [02083763-bbaf-5676-4eb4-00000000000b] 24160 1726853528.01139: sending task result for task 02083763-bbaf-5676-4eb4-00000000000b ok: [managed_node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 24160 1726853528.01258: no more pending results, returning what we have 24160 1726853528.01261: results queue empty 24160 1726853528.01262: checking for any_errors_fatal 24160 1726853528.01264: done checking for any_errors_fatal 24160 1726853528.01265: checking for max_fail_percentage 24160 1726853528.01267: done checking for max_fail_percentage 24160 1726853528.01268: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.01269: done checking to see if all hosts have failed 24160 1726853528.01270: getting the remaining hosts for this loop 24160 1726853528.01273: done getting the remaining hosts for this loop 24160 1726853528.01277: getting the next task for host managed_node1 24160 1726853528.01282: done getting next task for host managed_node1 24160 1726853528.01285: ^ task is: TASK: Include the task 'show_interfaces.yml' 24160 1726853528.01287: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853528.01291: getting variables 24160 1726853528.01293: in VariableManager get_vars() 24160 1726853528.01333: Calling all_inventory to load vars for managed_node1 24160 1726853528.01336: Calling groups_inventory to load vars for managed_node1 24160 1726853528.01340: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.01349: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.01353: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.01356: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.01734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.02062: done with get_vars() 24160 1726853528.02076: done getting variables 24160 1726853528.02116: done sending task result for task 02083763-bbaf-5676-4eb4-00000000000b 24160 1726853528.02119: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:14 Friday 20 September 2024 13:32:08 -0400 (0:00:00.035) 0:00:04.424 ****** 24160 1726853528.02188: entering _queue_task() for managed_node1/include_tasks 24160 1726853528.02508: worker is 1 (out of 1 available) 24160 1726853528.02520: exiting _queue_task() for managed_node1/include_tasks 24160 1726853528.02531: done queuing things up, now waiting for results queue to drain 24160 1726853528.02533: waiting for pending results... 
24160 1726853528.02751: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 24160 1726853528.02843: in run() - task 02083763-bbaf-5676-4eb4-00000000000c 24160 1726853528.02873: variable 'ansible_search_path' from source: unknown 24160 1726853528.02918: calling self._execute() 24160 1726853528.03009: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.03020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.03034: variable 'omit' from source: magic vars 24160 1726853528.03475: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.03479: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.03482: _execute() done 24160 1726853528.03484: dumping result to json 24160 1726853528.03486: done dumping result, returning 24160 1726853528.03488: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-5676-4eb4-00000000000c] 24160 1726853528.03490: sending task result for task 02083763-bbaf-5676-4eb4-00000000000c 24160 1726853528.03767: no more pending results, returning what we have 24160 1726853528.03775: in VariableManager get_vars() 24160 1726853528.03813: Calling all_inventory to load vars for managed_node1 24160 1726853528.03816: Calling groups_inventory to load vars for managed_node1 24160 1726853528.03818: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.03829: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.03832: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.03835: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.04162: done sending task result for task 02083763-bbaf-5676-4eb4-00000000000c 24160 1726853528.04166: WORKER PROCESS EXITING 24160 1726853528.04204: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.04385: done with get_vars() 24160 1726853528.04393: variable 'ansible_search_path' from source: unknown 24160 1726853528.04405: we have included files to process 24160 1726853528.04413: generating all_blocks data 24160 1726853528.04414: done generating all_blocks data 24160 1726853528.04415: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24160 1726853528.04416: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24160 1726853528.04419: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24160 1726853528.04581: in VariableManager get_vars() 24160 1726853528.04598: done with get_vars() 24160 1726853528.04717: done processing included file 24160 1726853528.04719: iterating over new_blocks loaded from include file 24160 1726853528.04721: in VariableManager get_vars() 24160 1726853528.04742: done with get_vars() 24160 1726853528.04744: filtering new block on tags 24160 1726853528.04764: done filtering new block on tags 24160 1726853528.04766: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 24160 1726853528.04776: extending task lists for all hosts with included blocks 24160 1726853528.05929: done extending task lists 24160 1726853528.05931: done processing included files 24160 1726853528.05938: results queue empty 24160 1726853528.05939: checking for any_errors_fatal 24160 1726853528.05942: done checking for any_errors_fatal 24160 1726853528.05943: checking for max_fail_percentage 24160 1726853528.05944: done checking for 
max_fail_percentage 24160 1726853528.05944: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.05945: done checking to see if all hosts have failed 24160 1726853528.05946: getting the remaining hosts for this loop 24160 1726853528.05948: done getting the remaining hosts for this loop 24160 1726853528.05951: getting the next task for host managed_node1 24160 1726853528.05956: done getting next task for host managed_node1 24160 1726853528.05959: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24160 1726853528.05961: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853528.05963: getting variables 24160 1726853528.05964: in VariableManager get_vars() 24160 1726853528.05978: Calling all_inventory to load vars for managed_node1 24160 1726853528.05981: Calling groups_inventory to load vars for managed_node1 24160 1726853528.05983: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.05988: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.05991: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.05998: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.06181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.06387: done with get_vars() 24160 1726853528.06401: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:32:08 -0400 (0:00:00.042) 0:00:04.467 ****** 24160 1726853528.06469: entering _queue_task() for managed_node1/include_tasks 24160 1726853528.06761: worker is 1 (out of 1 available) 24160 1726853528.06878: exiting _queue_task() for managed_node1/include_tasks 24160 1726853528.06888: done queuing things up, now waiting for results queue to drain 24160 1726853528.06889: waiting for pending results... 
24160 1726853528.07063: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 24160 1726853528.07174: in run() - task 02083763-bbaf-5676-4eb4-000000000115 24160 1726853528.07195: variable 'ansible_search_path' from source: unknown 24160 1726853528.07204: variable 'ansible_search_path' from source: unknown 24160 1726853528.07259: calling self._execute() 24160 1726853528.07362: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.07366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.07377: variable 'omit' from source: magic vars 24160 1726853528.07798: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.07802: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.07804: _execute() done 24160 1726853528.07806: dumping result to json 24160 1726853528.07808: done dumping result, returning 24160 1726853528.07811: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-5676-4eb4-000000000115] 24160 1726853528.07813: sending task result for task 02083763-bbaf-5676-4eb4-000000000115 24160 1726853528.07923: no more pending results, returning what we have 24160 1726853528.07929: in VariableManager get_vars() 24160 1726853528.07967: Calling all_inventory to load vars for managed_node1 24160 1726853528.07970: Calling groups_inventory to load vars for managed_node1 24160 1726853528.07975: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.07986: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.07989: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.07992: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.08357: done sending task result for task 02083763-bbaf-5676-4eb4-000000000115 24160 1726853528.08360: WORKER PROCESS EXITING 24160 
1726853528.08383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.08633: done with get_vars() 24160 1726853528.08642: variable 'ansible_search_path' from source: unknown 24160 1726853528.08643: variable 'ansible_search_path' from source: unknown 24160 1726853528.08685: we have included files to process 24160 1726853528.08686: generating all_blocks data 24160 1726853528.08688: done generating all_blocks data 24160 1726853528.08689: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24160 1726853528.08690: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24160 1726853528.08691: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24160 1726853528.09059: done processing included file 24160 1726853528.09061: iterating over new_blocks loaded from include file 24160 1726853528.09063: in VariableManager get_vars() 24160 1726853528.09079: done with get_vars() 24160 1726853528.09081: filtering new block on tags 24160 1726853528.09097: done filtering new block on tags 24160 1726853528.09100: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 24160 1726853528.09104: extending task lists for all hosts with included blocks 24160 1726853528.09234: done extending task lists 24160 1726853528.09236: done processing included files 24160 1726853528.09237: results queue empty 24160 1726853528.09237: checking for any_errors_fatal 24160 1726853528.09240: done checking for any_errors_fatal 24160 1726853528.09241: checking for max_fail_percentage 24160 1726853528.09242: done 
checking for max_fail_percentage 24160 1726853528.09243: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.09243: done checking to see if all hosts have failed 24160 1726853528.09244: getting the remaining hosts for this loop 24160 1726853528.09245: done getting the remaining hosts for this loop 24160 1726853528.09248: getting the next task for host managed_node1 24160 1726853528.09252: done getting next task for host managed_node1 24160 1726853528.09253: ^ task is: TASK: Gather current interface info 24160 1726853528.09256: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853528.09258: getting variables 24160 1726853528.09259: in VariableManager get_vars() 24160 1726853528.09316: Calling all_inventory to load vars for managed_node1 24160 1726853528.09319: Calling groups_inventory to load vars for managed_node1 24160 1726853528.09321: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.09326: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.09328: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.09331: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.09511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.09719: done with get_vars() 24160 1726853528.09727: done getting variables 24160 1726853528.09765: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:32:08 -0400 (0:00:00.033) 0:00:04.500 ****** 24160 1726853528.09797: entering _queue_task() for managed_node1/command 24160 1726853528.10033: worker is 1 (out of 1 available) 24160 1726853528.10045: exiting _queue_task() for managed_node1/command 24160 1726853528.10058: done queuing things up, now waiting for results queue to drain 24160 1726853528.10059: waiting for pending results... 
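The task being queued here ("Gather current interface info", `get_current_interfaces.yml:3`) ultimately runs `ls -1` with `chdir: /sys/class/net`, as the module invocation later in this log confirms. A minimal sketch of the same gathering step in plain Python (the helper name `list_interfaces` is ours, not Ansible's):

```python
import os

def list_interfaces(sysfs_dir="/sys/class/net"):
    """Return entries of the sysfs network directory, mirroring the
    `ls -1` command the task executes on the managed node."""
    return sorted(os.listdir(sysfs_dir))
```

On the managed node in this run it would yield `bonding_masters`, `eth0`, and `lo`; note that `bonding_masters` is a sysfs control file, not a real interface.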
24160 1726853528.10223: running TaskExecutor() for managed_node1/TASK: Gather current interface info 24160 1726853528.10291: in run() - task 02083763-bbaf-5676-4eb4-000000000192 24160 1726853528.10300: variable 'ansible_search_path' from source: unknown 24160 1726853528.10309: variable 'ansible_search_path' from source: unknown 24160 1726853528.10344: calling self._execute() 24160 1726853528.10418: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.10422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.10430: variable 'omit' from source: magic vars 24160 1726853528.10793: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.10800: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.10806: variable 'omit' from source: magic vars 24160 1726853528.10833: variable 'omit' from source: magic vars 24160 1726853528.10863: variable 'omit' from source: magic vars 24160 1726853528.10894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853528.10921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853528.10936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853528.10952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.10962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.10986: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853528.10989: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.10992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 
1726853528.11076: Set connection var ansible_shell_executable to /bin/sh 24160 1726853528.11080: Set connection var ansible_pipelining to False 24160 1726853528.11082: Set connection var ansible_connection to ssh 24160 1726853528.11085: Set connection var ansible_shell_type to sh 24160 1726853528.11087: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853528.11097: Set connection var ansible_timeout to 10 24160 1726853528.11112: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.11114: variable 'ansible_connection' from source: unknown 24160 1726853528.11117: variable 'ansible_module_compression' from source: unknown 24160 1726853528.11121: variable 'ansible_shell_type' from source: unknown 24160 1726853528.11124: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.11126: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.11128: variable 'ansible_pipelining' from source: unknown 24160 1726853528.11131: variable 'ansible_timeout' from source: unknown 24160 1726853528.11133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.11231: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853528.11239: variable 'omit' from source: magic vars 24160 1726853528.11248: starting attempt loop 24160 1726853528.11251: running the handler 24160 1726853528.11263: _low_level_execute_command(): starting 24160 1726853528.11270: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853528.11861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 24160 1726853528.11865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.11869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853528.11873: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.11948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.11956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.11961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.12014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.13816: stdout chunk (state=3): >>>/root <<< 24160 1726853528.13923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.13926: stdout chunk (state=3): >>><<< 24160 1726853528.13928: stderr chunk (state=3): >>><<< 24160 1726853528.14068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.14074: _low_level_execute_command(): starting 24160 1726853528.14077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281 `" && echo ansible-tmp-1726853528.1396813-24402-137807884259281="` echo /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281 `" ) && sleep 0' 24160 1726853528.14603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.14612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853528.14615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.14624: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853528.14626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853528.14640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.14662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.14674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.14732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.16639: stdout chunk (state=3): >>>ansible-tmp-1726853528.1396813-24402-137807884259281=/root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281 <<< 24160 1726853528.16759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.16763: stdout chunk (state=3): >>><<< 24160 1726853528.16773: stderr chunk (state=3): >>><<< 24160 1726853528.16786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853528.1396813-24402-137807884259281=/root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.16810: variable 'ansible_module_compression' from source: unknown 24160 1726853528.16846: ANSIBALLZ: Using generic lock for ansible.legacy.command 24160 1726853528.16849: ANSIBALLZ: Acquiring lock 24160 1726853528.16851: ANSIBALLZ: Lock acquired: 140302803944608 24160 1726853528.16855: ANSIBALLZ: Creating module 24160 1726853528.26784: ANSIBALLZ: Writing module into payload 24160 1726853528.26815: ANSIBALLZ: Writing module 24160 1726853528.26835: ANSIBALLZ: Renaming module 24160 1726853528.26839: ANSIBALLZ: Done creating module 24160 1726853528.26861: variable 'ansible_facts' from source: unknown 24160 1726853528.26931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py 24160 1726853528.27094: Sending initial data 24160 1726853528.27106: Sent initial data (156 bytes) 24160 1726853528.27694: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853528.27704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853528.27715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.27785: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.27822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.27876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.27882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.27934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.29588: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853528.29647: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853528.29689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmprozalz1c /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py <<< 24160 1726853528.29692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py" <<< 24160 1726853528.29732: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmprozalz1c" to remote "/root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py" <<< 24160 1726853528.30290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.30321: stderr chunk (state=3): >>><<< 24160 1726853528.30325: stdout chunk (state=3): >>><<< 24160 1726853528.30341: done transferring module to remote 24160 1726853528.30349: _low_level_execute_command(): starting 24160 1726853528.30354: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/ /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py && sleep 0' 24160 1726853528.30753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853528.30776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853528.30779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.30786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853528.30789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.30800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.30846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.30849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.30859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.30911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.32708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.32731: stderr chunk (state=3): >>><<< 24160 1726853528.32734: stdout chunk (state=3): >>><<< 24160 1726853528.32746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.32749: _low_level_execute_command(): starting 24160 1726853528.32755: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/AnsiballZ_command.py && sleep 0' 24160 1726853528.33160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.33168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.33189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.33192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.33243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.33246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.33317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.48935: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:32:08.484993", "end": "2024-09-20 13:32:08.488463", "delta": "0:00:00.003470", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853528.50590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853528.50665: stderr chunk (state=3): >>><<< 24160 1726853528.50669: stdout chunk (state=3): >>><<< 24160 1726853528.50674: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:32:08.484993", "end": "2024-09-20 13:32:08.488463", "delta": "0:00:00.003470", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
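The JSON document on stdout above is the entire module result; the controller recovers it by parsing the stdout of the remote Python process. A sketch of that parsing step, using the payload from this run (with the `invocation` details trimmed for brevity):

```python
import json

# Module result as it appeared on stdout in the log above.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"], '
       '"start": "2024-09-20 13:32:08.484993", '
       '"end": "2024-09-20 13:32:08.488463", '
       '"delta": "0:00:00.003470", "msg": ""}')

result = json.loads(raw)
interfaces = result["stdout"].splitlines()
```

The split of `stdout` on newlines is what the `stdout_lines` return value exposes, which is presumably what the following "Set current_interfaces" task consumes.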
24160 1726853528.50722: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853528.50725: _low_level_execute_command(): starting 24160 1726853528.50728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853528.1396813-24402-137807884259281/ > /dev/null 2>&1 && sleep 0' 24160 1726853528.51398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.51433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.51451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.53369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.53395: stderr chunk (state=3): >>><<< 24160 1726853528.53398: stdout chunk (state=3): >>><<< 24160 1726853528.53411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.53417: handler run complete 24160 1726853528.53435: Evaluated conditional (False): False 24160 1726853528.53445: attempt loop complete, returning 
result 24160 1726853528.53448: _execute() done 24160 1726853528.53450: dumping result to json 24160 1726853528.53452: done dumping result, returning 24160 1726853528.53461: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-5676-4eb4-000000000192] 24160 1726853528.53465: sending task result for task 02083763-bbaf-5676-4eb4-000000000192 24160 1726853528.53564: done sending task result for task 02083763-bbaf-5676-4eb4-000000000192 24160 1726853528.53566: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003470", "end": "2024-09-20 13:32:08.488463", "rc": 0, "start": "2024-09-20 13:32:08.484993" } STDOUT: bonding_masters eth0 lo 24160 1726853528.53648: no more pending results, returning what we have 24160 1726853528.53651: results queue empty 24160 1726853528.53652: checking for any_errors_fatal 24160 1726853528.53653: done checking for any_errors_fatal 24160 1726853528.53654: checking for max_fail_percentage 24160 1726853528.53656: done checking for max_fail_percentage 24160 1726853528.53656: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.53657: done checking to see if all hosts have failed 24160 1726853528.53658: getting the remaining hosts for this loop 24160 1726853528.53659: done getting the remaining hosts for this loop 24160 1726853528.53663: getting the next task for host managed_node1 24160 1726853528.53669: done getting next task for host managed_node1 24160 1726853528.53673: ^ task is: TASK: Set current_interfaces 24160 1726853528.53677: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853528.53680: getting variables 24160 1726853528.53734: in VariableManager get_vars() 24160 1726853528.53760: Calling all_inventory to load vars for managed_node1 24160 1726853528.53762: Calling groups_inventory to load vars for managed_node1 24160 1726853528.53764: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.53777: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.53780: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.53783: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.53909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.54025: done with get_vars() 24160 1726853528.54033: done getting variables 24160 1726853528.54076: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:32:08 -0400 (0:00:00.443) 0:00:04.943 ****** 24160 1726853528.54098: entering _queue_task() for managed_node1/set_fact 
24160 1726853528.54304: worker is 1 (out of 1 available) 24160 1726853528.54319: exiting _queue_task() for managed_node1/set_fact 24160 1726853528.54331: done queuing things up, now waiting for results queue to drain 24160 1726853528.54333: waiting for pending results... 24160 1726853528.54486: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 24160 1726853528.54549: in run() - task 02083763-bbaf-5676-4eb4-000000000193 24160 1726853528.54560: variable 'ansible_search_path' from source: unknown 24160 1726853528.54563: variable 'ansible_search_path' from source: unknown 24160 1726853528.54594: calling self._execute() 24160 1726853528.54650: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.54663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.54675: variable 'omit' from source: magic vars 24160 1726853528.54945: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.54955: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.54964: variable 'omit' from source: magic vars 24160 1726853528.55076: variable 'omit' from source: magic vars 24160 1726853528.55119: variable '_current_interfaces' from source: set_fact 24160 1726853528.55178: variable 'omit' from source: magic vars 24160 1726853528.55226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853528.55265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853528.55283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853528.55305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.55414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 24160 1726853528.55417: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853528.55420: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.55422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.55476: Set connection var ansible_shell_executable to /bin/sh 24160 1726853528.55523: Set connection var ansible_pipelining to False 24160 1726853528.55527: Set connection var ansible_connection to ssh 24160 1726853528.55549: Set connection var ansible_shell_type to sh 24160 1726853528.55553: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853528.55555: Set connection var ansible_timeout to 10 24160 1726853528.55557: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.55560: variable 'ansible_connection' from source: unknown 24160 1726853528.55563: variable 'ansible_module_compression' from source: unknown 24160 1726853528.55565: variable 'ansible_shell_type' from source: unknown 24160 1726853528.55567: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.55569: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.55573: variable 'ansible_pipelining' from source: unknown 24160 1726853528.55575: variable 'ansible_timeout' from source: unknown 24160 1726853528.55577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.55748: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853528.55757: variable 'omit' from source: magic vars 24160 1726853528.55759: starting attempt loop 24160 1726853528.55762: running the handler 24160 
1726853528.55764: handler run complete 24160 1726853528.55767: attempt loop complete, returning result 24160 1726853528.55769: _execute() done 24160 1726853528.55773: dumping result to json 24160 1726853528.55775: done dumping result, returning 24160 1726853528.55778: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-5676-4eb4-000000000193] 24160 1726853528.55788: sending task result for task 02083763-bbaf-5676-4eb4-000000000193 24160 1726853528.55888: done sending task result for task 02083763-bbaf-5676-4eb4-000000000193 24160 1726853528.55891: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24160 1726853528.55944: no more pending results, returning what we have 24160 1726853528.55946: results queue empty 24160 1726853528.55947: checking for any_errors_fatal 24160 1726853528.55955: done checking for any_errors_fatal 24160 1726853528.55955: checking for max_fail_percentage 24160 1726853528.55957: done checking for max_fail_percentage 24160 1726853528.55958: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.55958: done checking to see if all hosts have failed 24160 1726853528.55959: getting the remaining hosts for this loop 24160 1726853528.55961: done getting the remaining hosts for this loop 24160 1726853528.55969: getting the next task for host managed_node1 24160 1726853528.55977: done getting next task for host managed_node1 24160 1726853528.55980: ^ task is: TASK: Show current_interfaces 24160 1726853528.55982: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853528.55985: getting variables 24160 1726853528.55986: in VariableManager get_vars() 24160 1726853528.56024: Calling all_inventory to load vars for managed_node1 24160 1726853528.56026: Calling groups_inventory to load vars for managed_node1 24160 1726853528.56028: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.56036: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.56038: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.56041: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.56257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.56444: done with get_vars() 24160 1726853528.56460: done getting variables 24160 1726853528.56558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:32:08 -0400 (0:00:00.024) 0:00:04.968 ****** 24160 1726853528.56589: entering _queue_task() for managed_node1/debug 24160 1726853528.56591: Creating lock for debug 24160 1726853528.56813: worker is 1 (out of 1 available) 24160 1726853528.56827: exiting _queue_task() for managed_node1/debug 24160 1726853528.56838: done queuing things up, now waiting for results queue to drain 24160 1726853528.56840: waiting 
for pending results... 24160 1726853528.56994: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 24160 1726853528.57111: in run() - task 02083763-bbaf-5676-4eb4-000000000116 24160 1726853528.57116: variable 'ansible_search_path' from source: unknown 24160 1726853528.57119: variable 'ansible_search_path' from source: unknown 24160 1726853528.57142: calling self._execute() 24160 1726853528.57232: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.57236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.57245: variable 'omit' from source: magic vars 24160 1726853528.57654: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.57676: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.57680: variable 'omit' from source: magic vars 24160 1726853528.57723: variable 'omit' from source: magic vars 24160 1726853528.57831: variable 'current_interfaces' from source: set_fact 24160 1726853528.57880: variable 'omit' from source: magic vars 24160 1726853528.57900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853528.57930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853528.57994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853528.57998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.58000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.58016: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853528.58019: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.58024: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.58094: Set connection var ansible_shell_executable to /bin/sh 24160 1726853528.58101: Set connection var ansible_pipelining to False 24160 1726853528.58105: Set connection var ansible_connection to ssh 24160 1726853528.58107: Set connection var ansible_shell_type to sh 24160 1726853528.58114: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853528.58123: Set connection var ansible_timeout to 10 24160 1726853528.58165: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.58168: variable 'ansible_connection' from source: unknown 24160 1726853528.58173: variable 'ansible_module_compression' from source: unknown 24160 1726853528.58178: variable 'ansible_shell_type' from source: unknown 24160 1726853528.58180: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.58182: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.58184: variable 'ansible_pipelining' from source: unknown 24160 1726853528.58186: variable 'ansible_timeout' from source: unknown 24160 1726853528.58194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.58404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853528.58414: variable 'omit' from source: magic vars 24160 1726853528.58420: starting attempt loop 24160 1726853528.58422: running the handler 24160 1726853528.58481: handler run complete 24160 1726853528.58484: attempt loop complete, returning result 24160 1726853528.58487: _execute() done 24160 1726853528.58491: dumping result to json 24160 1726853528.58493: done dumping result, returning 24160 
1726853528.58496: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-5676-4eb4-000000000116] 24160 1726853528.58499: sending task result for task 02083763-bbaf-5676-4eb4-000000000116 24160 1726853528.58630: done sending task result for task 02083763-bbaf-5676-4eb4-000000000116 24160 1726853528.58633: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24160 1726853528.58691: no more pending results, returning what we have 24160 1726853528.58694: results queue empty 24160 1726853528.58695: checking for any_errors_fatal 24160 1726853528.58700: done checking for any_errors_fatal 24160 1726853528.58700: checking for max_fail_percentage 24160 1726853528.58702: done checking for max_fail_percentage 24160 1726853528.58703: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.58704: done checking to see if all hosts have failed 24160 1726853528.58704: getting the remaining hosts for this loop 24160 1726853528.58706: done getting the remaining hosts for this loop 24160 1726853528.58709: getting the next task for host managed_node1 24160 1726853528.58716: done getting next task for host managed_node1 24160 1726853528.58719: ^ task is: TASK: Include the task 'manage_test_interface.yml' 24160 1726853528.58721: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853528.58723: getting variables 24160 1726853528.58725: in VariableManager get_vars() 24160 1726853528.58759: Calling all_inventory to load vars for managed_node1 24160 1726853528.58762: Calling groups_inventory to load vars for managed_node1 24160 1726853528.58764: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.58774: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.58780: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.58784: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.58964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.59147: done with get_vars() 24160 1726853528.59159: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:16 Friday 20 September 2024 13:32:08 -0400 (0:00:00.026) 0:00:04.995 ****** 24160 1726853528.59259: entering _queue_task() for managed_node1/include_tasks 24160 1726853528.59539: worker is 1 (out of 1 available) 24160 1726853528.59552: exiting _queue_task() for managed_node1/include_tasks 24160 1726853528.59568: done queuing things up, now waiting for results queue to drain 24160 1726853528.59569: waiting for pending results... 
24160 1726853528.59796: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 24160 1726853528.59878: in run() - task 02083763-bbaf-5676-4eb4-00000000000d 24160 1726853528.59895: variable 'ansible_search_path' from source: unknown 24160 1726853528.59930: calling self._execute() 24160 1726853528.60013: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.60017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.60023: variable 'omit' from source: magic vars 24160 1726853528.60394: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.60403: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.60409: _execute() done 24160 1726853528.60412: dumping result to json 24160 1726853528.60415: done dumping result, returning 24160 1726853528.60422: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-5676-4eb4-00000000000d] 24160 1726853528.60427: sending task result for task 02083763-bbaf-5676-4eb4-00000000000d 24160 1726853528.60519: done sending task result for task 02083763-bbaf-5676-4eb4-00000000000d 24160 1726853528.60521: WORKER PROCESS EXITING 24160 1726853528.60546: no more pending results, returning what we have 24160 1726853528.60550: in VariableManager get_vars() 24160 1726853528.60594: Calling all_inventory to load vars for managed_node1 24160 1726853528.60596: Calling groups_inventory to load vars for managed_node1 24160 1726853528.60599: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.60609: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.60611: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.60614: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.60801: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.60985: done with get_vars() 24160 1726853528.60992: variable 'ansible_search_path' from source: unknown 24160 1726853528.61001: we have included files to process 24160 1726853528.61002: generating all_blocks data 24160 1726853528.61003: done generating all_blocks data 24160 1726853528.61008: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24160 1726853528.61009: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24160 1726853528.61012: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24160 1726853528.61469: in VariableManager get_vars() 24160 1726853528.61490: done with get_vars() 24160 1726853528.61705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 24160 1726853528.62386: done processing included file 24160 1726853528.62388: iterating over new_blocks loaded from include file 24160 1726853528.62389: in VariableManager get_vars() 24160 1726853528.62400: done with get_vars() 24160 1726853528.62401: filtering new block on tags 24160 1726853528.62420: done filtering new block on tags 24160 1726853528.62421: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 24160 1726853528.62425: extending task lists for all hosts with included blocks 24160 1726853528.63245: done extending task lists 24160 1726853528.63246: done processing included files 24160 1726853528.63247: results queue empty 24160 1726853528.63247: checking for any_errors_fatal 24160 1726853528.63249: done checking for 
any_errors_fatal 24160 1726853528.63250: checking for max_fail_percentage 24160 1726853528.63251: done checking for max_fail_percentage 24160 1726853528.63251: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.63252: done checking to see if all hosts have failed 24160 1726853528.63252: getting the remaining hosts for this loop 24160 1726853528.63255: done getting the remaining hosts for this loop 24160 1726853528.63257: getting the next task for host managed_node1 24160 1726853528.63259: done getting next task for host managed_node1 24160 1726853528.63261: ^ task is: TASK: Ensure state in ["present", "absent"] 24160 1726853528.63262: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853528.63266: getting variables 24160 1726853528.63267: in VariableManager get_vars() 24160 1726853528.63286: Calling all_inventory to load vars for managed_node1 24160 1726853528.63289: Calling groups_inventory to load vars for managed_node1 24160 1726853528.63291: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.63297: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.63298: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.63300: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.63407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.63528: done with get_vars() 24160 1726853528.63535: done getting variables 24160 1726853528.63586: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:32:08 -0400 (0:00:00.043) 0:00:05.038 ****** 24160 1726853528.63619: entering _queue_task() for managed_node1/fail 24160 1726853528.63621: Creating lock for fail 24160 1726853528.63914: worker is 1 (out of 1 available) 24160 1726853528.63927: exiting _queue_task() for managed_node1/fail 24160 1726853528.63939: done queuing things up, now waiting for results queue to drain 24160 1726853528.63941: waiting for pending results... 
24160 1726853528.64190: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 24160 1726853528.64278: in run() - task 02083763-bbaf-5676-4eb4-0000000001ae 24160 1726853528.64282: variable 'ansible_search_path' from source: unknown 24160 1726853528.64285: variable 'ansible_search_path' from source: unknown 24160 1726853528.64305: calling self._execute() 24160 1726853528.64374: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.64377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.64386: variable 'omit' from source: magic vars 24160 1726853528.64658: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.64678: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.64772: variable 'state' from source: include params 24160 1726853528.64776: Evaluated conditional (state not in ["present", "absent"]): False 24160 1726853528.64778: when evaluation is False, skipping this task 24160 1726853528.64781: _execute() done 24160 1726853528.64787: dumping result to json 24160 1726853528.64791: done dumping result, returning 24160 1726853528.64795: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-5676-4eb4-0000000001ae] 24160 1726853528.64798: sending task result for task 02083763-bbaf-5676-4eb4-0000000001ae 24160 1726853528.64879: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001ae 24160 1726853528.64882: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 24160 1726853528.64931: no more pending results, returning what we have 24160 1726853528.64935: results queue empty 24160 1726853528.64936: checking for any_errors_fatal 24160 1726853528.64937: done checking for any_errors_fatal 24160 1726853528.64937: 
checking for max_fail_percentage 24160 1726853528.64939: done checking for max_fail_percentage 24160 1726853528.64939: checking to see if all hosts have failed and the running result is not ok 24160 1726853528.64940: done checking to see if all hosts have failed 24160 1726853528.64941: getting the remaining hosts for this loop 24160 1726853528.64942: done getting the remaining hosts for this loop 24160 1726853528.64945: getting the next task for host managed_node1 24160 1726853528.64950: done getting next task for host managed_node1 24160 1726853528.64952: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 24160 1726853528.64956: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853528.64959: getting variables 24160 1726853528.64961: in VariableManager get_vars() 24160 1726853528.64996: Calling all_inventory to load vars for managed_node1 24160 1726853528.64998: Calling groups_inventory to load vars for managed_node1 24160 1726853528.65000: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853528.65010: Calling all_plugins_play to load vars for managed_node1 24160 1726853528.65013: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853528.65015: Calling groups_plugins_play to load vars for managed_node1 24160 1726853528.65225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853528.65364: done with get_vars() 24160 1726853528.65372: done getting variables 24160 1726853528.65412: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:32:08 -0400 (0:00:00.018) 0:00:05.057 ****** 24160 1726853528.65433: entering _queue_task() for managed_node1/fail 24160 1726853528.65609: worker is 1 (out of 1 available) 24160 1726853528.65621: exiting _queue_task() for managed_node1/fail 24160 1726853528.65632: done queuing things up, now waiting for results queue to drain 24160 1726853528.65634: waiting for pending results... 
24160 1726853528.65794: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"]
24160 1726853528.65855: in run() - task 02083763-bbaf-5676-4eb4-0000000001af
24160 1726853528.65860: variable 'ansible_search_path' from source: unknown
24160 1726853528.65864: variable 'ansible_search_path' from source: unknown
24160 1726853528.65891: calling self._execute()
24160 1726853528.65960: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853528.65964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853528.65967: variable 'omit' from source: magic vars
24160 1726853528.66215: variable 'ansible_distribution_major_version' from source: facts
24160 1726853528.66225: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853528.66318: variable 'type' from source: set_fact
24160 1726853528.66322: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False
24160 1726853528.66325: when evaluation is False, skipping this task
24160 1726853528.66328: _execute() done
24160 1726853528.66330: dumping result to json
24160 1726853528.66335: done dumping result, returning
24160 1726853528.66340: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-5676-4eb4-0000000001af]
24160 1726853528.66345: sending task result for task 02083763-bbaf-5676-4eb4-0000000001af
24160 1726853528.66426: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001af
24160 1726853528.66429: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]",
    "skip_reason": "Conditional result was False"
}
24160 1726853528.66483: no more pending results, returning what we have
24160 1726853528.66487: results queue empty
24160 1726853528.66488: checking for any_errors_fatal
24160 1726853528.66491: done checking for any_errors_fatal
24160 1726853528.66492: checking for max_fail_percentage
24160 1726853528.66493: done checking for max_fail_percentage
24160 1726853528.66494: checking to see if all hosts have failed and the running result is not ok
24160 1726853528.66494: done checking to see if all hosts have failed
24160 1726853528.66495: getting the remaining hosts for this loop
24160 1726853528.66496: done getting the remaining hosts for this loop
24160 1726853528.66499: getting the next task for host managed_node1
24160 1726853528.66516: done getting next task for host managed_node1
24160 1726853528.66518: ^ task is: TASK: Include the task 'show_interfaces.yml'
24160 1726853528.66521: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853528.66524: getting variables
24160 1726853528.66525: in VariableManager get_vars()
24160 1726853528.66557: Calling all_inventory to load vars for managed_node1
24160 1726853528.66560: Calling groups_inventory to load vars for managed_node1
24160 1726853528.66562: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853528.66576: Calling all_plugins_play to load vars for managed_node1
24160 1726853528.66578: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853528.66582: Calling groups_plugins_play to load vars for managed_node1
24160 1726853528.66703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853528.66909: done with get_vars()
24160 1726853528.66919: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13
Friday 20 September 2024 13:32:08 -0400 (0:00:00.015) 0:00:05.072 ******
24160 1726853528.67006: entering _queue_task() for managed_node1/include_tasks
24160 1726853528.67231: worker is 1 (out of 1 available)
24160 1726853528.67243: exiting _queue_task() for managed_node1/include_tasks
24160 1726853528.67257: done queuing things up, now waiting for results queue to drain
24160 1726853528.67259: waiting for pending results...
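The "skipping" result above records only the failed conditional (`type not in ["dummy", "tap", "veth"]`), not the module the task uses. A hypothetical minimal sketch of a guard task that would produce exactly this skip, assuming `ansible.builtin.fail` (the actual module is not visible in the log):

```yaml
# Hypothetical reconstruction -- the log shows only the task name,
# the conditional, and the skip result, not the task body.
- name: Ensure type in ["dummy", "tap", "veth"]
  ansible.builtin.fail:
    msg: "Unsupported interface type: {{ type }}"
  when: type not in ["dummy", "tap", "veth"]
```

With `type` set to one of the allowed values (as here, via `set_fact`), the `when` expression evaluates to False and the task is skipped rather than failing the play.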
24160 1726853528.67657: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml'
24160 1726853528.67663: in run() - task 02083763-bbaf-5676-4eb4-0000000001b0
24160 1726853528.67666: variable 'ansible_search_path' from source: unknown
24160 1726853528.67669: variable 'ansible_search_path' from source: unknown
24160 1726853528.67676: calling self._execute()
24160 1726853528.67791: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853528.67795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853528.67798: variable 'omit' from source: magic vars
24160 1726853528.68182: variable 'ansible_distribution_major_version' from source: facts
24160 1726853528.68224: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853528.68228: _execute() done
24160 1726853528.68232: dumping result to json
24160 1726853528.68234: done dumping result, returning
24160 1726853528.68237: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-5676-4eb4-0000000001b0]
24160 1726853528.68239: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b0
24160 1726853528.68400: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b0
24160 1726853528.68404: WORKER PROCESS EXITING
24160 1726853528.68488: no more pending results, returning what we have
24160 1726853528.68494: in VariableManager get_vars()
24160 1726853528.68555: Calling all_inventory to load vars for managed_node1
24160 1726853528.68561: Calling groups_inventory to load vars for managed_node1
24160 1726853528.68567: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853528.68587: Calling all_plugins_play to load vars for managed_node1
24160 1726853528.68593: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853528.68596: Calling groups_plugins_play to load vars for managed_node1
24160 1726853528.68920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853528.69126: done with get_vars()
24160 1726853528.69133: variable 'ansible_search_path' from source: unknown
24160 1726853528.69134: variable 'ansible_search_path' from source: unknown
24160 1726853528.69158: we have included files to process
24160 1726853528.69159: generating all_blocks data
24160 1726853528.69160: done generating all_blocks data
24160 1726853528.69162: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
24160 1726853528.69163: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
24160 1726853528.69164: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
24160 1726853528.69259: in VariableManager get_vars()
24160 1726853528.69275: done with get_vars()
24160 1726853528.69393: done processing included file
24160 1726853528.69395: iterating over new_blocks loaded from include file
24160 1726853528.69396: in VariableManager get_vars()
24160 1726853528.69406: done with get_vars()
24160 1726853528.69407: filtering new block on tags
24160 1726853528.69419: done filtering new block on tags
24160 1726853528.69420: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1
24160 1726853528.69423: extending task lists for all hosts with included blocks
24160 1726853528.69812: done extending task lists
24160 1726853528.69814: done processing included files
24160 1726853528.69815: results queue empty
24160 1726853528.69815: checking for any_errors_fatal
24160 1726853528.69817: done checking for any_errors_fatal
24160 1726853528.69818: checking for max_fail_percentage
24160 1726853528.69819: done checking for max_fail_percentage
24160 1726853528.69820: checking to see if all hosts have failed and the running result is not ok
24160 1726853528.69821: done checking to see if all hosts have failed
24160 1726853528.69822: getting the remaining hosts for this loop
24160 1726853528.69823: done getting the remaining hosts for this loop
24160 1726853528.69825: getting the next task for host managed_node1
24160 1726853528.69829: done getting next task for host managed_node1
24160 1726853528.69832: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
24160 1726853528.69835: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853528.69837: getting variables
24160 1726853528.69838: in VariableManager get_vars()
24160 1726853528.69846: Calling all_inventory to load vars for managed_node1
24160 1726853528.69847: Calling groups_inventory to load vars for managed_node1
24160 1726853528.69848: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853528.69852: Calling all_plugins_play to load vars for managed_node1
24160 1726853528.69853: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853528.69856: Calling groups_plugins_play to load vars for managed_node1
24160 1726853528.69989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853528.70188: done with get_vars()
24160 1726853528.70198: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 13:32:08 -0400 (0:00:00.032) 0:00:05.105 ******
24160 1726853528.70257: entering _queue_task() for managed_node1/include_tasks
24160 1726853528.70620: worker is 1 (out of 1 available)
24160 1726853528.70634: exiting _queue_task() for managed_node1/include_tasks
24160 1726853528.70645: done queuing things up, now waiting for results queue to drain
24160 1726853528.70646: waiting for pending results...
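The task paths in the banners above trace a two-level include chain: `manage_test_interface.yml:13` pulls in `show_interfaces.yml`, whose line 3 in turn pulls in `get_current_interfaces.yml`. A hypothetical sketch of that chain (the actual file contents are not reproduced in the log):

```yaml
# manage_test_interface.yml, line 13 (sketch; relative path assumed)
- name: Include the task 'show_interfaces.yml'
  ansible.builtin.include_tasks: tasks/show_interfaces.yml

# show_interfaces.yml, line 3 (sketch)
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks: get_current_interfaces.yml
```

Because `include_tasks` is dynamic, each include is itself a task: the log shows it queued, executed, and its result sent before the included blocks are loaded and the host's task list is extended.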
24160 1726853528.71401: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml'
24160 1726853528.71406: in run() - task 02083763-bbaf-5676-4eb4-000000000245
24160 1726853528.71409: variable 'ansible_search_path' from source: unknown
24160 1726853528.71412: variable 'ansible_search_path' from source: unknown
24160 1726853528.71414: calling self._execute()
24160 1726853528.71602: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853528.71736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853528.71740: variable 'omit' from source: magic vars
24160 1726853528.72842: variable 'ansible_distribution_major_version' from source: facts
24160 1726853528.72951: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853528.72965: _execute() done
24160 1726853528.73064: dumping result to json
24160 1726853528.73069: done dumping result, returning
24160 1726853528.73073: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-5676-4eb4-000000000245]
24160 1726853528.73075: sending task result for task 02083763-bbaf-5676-4eb4-000000000245
24160 1726853528.73147: done sending task result for task 02083763-bbaf-5676-4eb4-000000000245
24160 1726853528.73150: WORKER PROCESS EXITING
24160 1726853528.73201: no more pending results, returning what we have
24160 1726853528.73207: in VariableManager get_vars()
24160 1726853528.73258: Calling all_inventory to load vars for managed_node1
24160 1726853528.73262: Calling groups_inventory to load vars for managed_node1
24160 1726853528.73265: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853528.73290: Calling all_plugins_play to load vars for managed_node1
24160 1726853528.73294: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853528.73299: Calling groups_plugins_play to load vars for managed_node1
24160 1726853528.73641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853528.73846: done with get_vars()
24160 1726853528.73855: variable 'ansible_search_path' from source: unknown
24160 1726853528.73856: variable 'ansible_search_path' from source: unknown
24160 1726853528.73915: we have included files to process
24160 1726853528.73916: generating all_blocks data
24160 1726853528.73918: done generating all_blocks data
24160 1726853528.73919: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
24160 1726853528.73920: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
24160 1726853528.73922: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
24160 1726853528.74169: done processing included file
24160 1726853528.74173: iterating over new_blocks loaded from include file
24160 1726853528.74174: in VariableManager get_vars()
24160 1726853528.74190: done with get_vars()
24160 1726853528.74192: filtering new block on tags
24160 1726853528.74209: done filtering new block on tags
24160 1726853528.74211: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1
24160 1726853528.74216: extending task lists for all hosts with included blocks
24160 1726853528.74356: done extending task lists
24160 1726853528.74358: done processing included files
24160 1726853528.74359: results queue empty
24160 1726853528.74359: checking for any_errors_fatal
24160 1726853528.74362: done checking for any_errors_fatal
24160 1726853528.74363: checking for max_fail_percentage
24160 1726853528.74364: done checking for max_fail_percentage
24160 1726853528.74365: checking to see if all hosts have failed and the running result is not ok
24160 1726853528.74365: done checking to see if all hosts have failed
24160 1726853528.74366: getting the remaining hosts for this loop
24160 1726853528.74367: done getting the remaining hosts for this loop
24160 1726853528.74370: getting the next task for host managed_node1
24160 1726853528.74376: done getting next task for host managed_node1
24160 1726853528.74378: ^ task is: TASK: Gather current interface info
24160 1726853528.74381: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853528.74383: getting variables
24160 1726853528.74384: in VariableManager get_vars()
24160 1726853528.74395: Calling all_inventory to load vars for managed_node1
24160 1726853528.74397: Calling groups_inventory to load vars for managed_node1
24160 1726853528.74399: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853528.74403: Calling all_plugins_play to load vars for managed_node1
24160 1726853528.74405: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853528.74408: Calling groups_plugins_play to load vars for managed_node1
24160 1726853528.74574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853528.74762: done with get_vars()
24160 1726853528.74772: done getting variables
24160 1726853528.74812: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 13:32:08 -0400 (0:00:00.045) 0:00:05.151 ******
24160 1726853528.74845: entering _queue_task() for managed_node1/command
24160 1726853528.75152: worker is 1 (out of 1 available)
24160 1726853528.75164: exiting _queue_task() for managed_node1/command
24160 1726853528.75178: done queuing things up, now waiting for results queue to drain
24160 1726853528.75179: waiting for pending results...
24160 1726853528.75787: running TaskExecutor() for managed_node1/TASK: Gather current interface info 24160 1726853528.75792: in run() - task 02083763-bbaf-5676-4eb4-00000000027c 24160 1726853528.75796: variable 'ansible_search_path' from source: unknown 24160 1726853528.75798: variable 'ansible_search_path' from source: unknown 24160 1726853528.75801: calling self._execute() 24160 1726853528.75803: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.75806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.75808: variable 'omit' from source: magic vars 24160 1726853528.76116: variable 'ansible_distribution_major_version' from source: facts 24160 1726853528.76137: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853528.76160: variable 'omit' from source: magic vars 24160 1726853528.76224: variable 'omit' from source: magic vars 24160 1726853528.76269: variable 'omit' from source: magic vars 24160 1726853528.76315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853528.76357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853528.76385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853528.76406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.76422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853528.76460: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853528.76476: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.76484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 
1726853528.76584: Set connection var ansible_shell_executable to /bin/sh 24160 1726853528.76595: Set connection var ansible_pipelining to False 24160 1726853528.76685: Set connection var ansible_connection to ssh 24160 1726853528.76689: Set connection var ansible_shell_type to sh 24160 1726853528.76691: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853528.76693: Set connection var ansible_timeout to 10 24160 1726853528.76695: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.76697: variable 'ansible_connection' from source: unknown 24160 1726853528.76699: variable 'ansible_module_compression' from source: unknown 24160 1726853528.76701: variable 'ansible_shell_type' from source: unknown 24160 1726853528.76703: variable 'ansible_shell_executable' from source: unknown 24160 1726853528.76706: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853528.76708: variable 'ansible_pipelining' from source: unknown 24160 1726853528.76710: variable 'ansible_timeout' from source: unknown 24160 1726853528.76712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853528.76843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853528.76859: variable 'omit' from source: magic vars 24160 1726853528.76870: starting attempt loop 24160 1726853528.76880: running the handler 24160 1726853528.76902: _low_level_execute_command(): starting 24160 1726853528.76914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853528.77644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853528.77678: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.77764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.77885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.79579: stdout chunk (state=3): >>>/root <<< 24160 1726853528.79735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.79738: stdout chunk (state=3): >>><<< 24160 1726853528.79741: stderr chunk (state=3): >>><<< 24160 1726853528.79857: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.79861: _low_level_execute_command(): starting 24160 1726853528.79865: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791 `" && echo ansible-tmp-1726853528.7976303-24428-231482256943791="` echo /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791 `" ) && sleep 0' 24160 1726853528.80492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.80530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.80544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.80605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.82585: stdout chunk (state=3): >>>ansible-tmp-1726853528.7976303-24428-231482256943791=/root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791 <<< 24160 1726853528.82712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.82715: stdout chunk (state=3): >>><<< 24160 1726853528.82718: stderr chunk (state=3): >>><<< 24160 1726853528.82736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853528.7976303-24428-231482256943791=/root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.82836: variable 'ansible_module_compression' from source: unknown 24160 1726853528.82840: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853528.82888: variable 'ansible_facts' from source: unknown 24160 1726853528.82952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py 24160 1726853528.83076: Sending initial data 24160 1726853528.83112: Sent initial data (156 bytes) 24160 1726853528.83563: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.83579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.83592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.83638: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.83656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.83698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.85438: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 24160 1726853528.85443: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853528.85482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853528.85581: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp5dt3pjm4 /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py <<< 24160 1726853528.85584: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py" <<< 24160 1726853528.85775: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp5dt3pjm4" to remote "/root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py" <<< 24160 1726853528.86734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.86794: stderr chunk (state=3): >>><<< 24160 1726853528.86915: stdout chunk (state=3): >>><<< 24160 1726853528.86918: done transferring module to remote 24160 1726853528.86920: _low_level_execute_command(): starting 24160 1726853528.86922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/ /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py && sleep 0' 24160 1726853528.87488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.87547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853528.87552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.87557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.87797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853528.89412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853528.89439: stderr chunk (state=3): >>><<< 24160 1726853528.89453: stdout chunk (state=3): >>><<< 24160 1726853528.89480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853528.89488: _low_level_execute_command(): starting 24160 1726853528.89497: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/AnsiballZ_command.py && sleep 0' 24160 1726853528.90130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853528.90180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853528.90183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853528.90185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853528.90188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.90190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853528.90192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853528.90232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 
24160 1726853528.90235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853528.90238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853528.90306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.06058: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:32:09.056197", "end": "2024-09-20 13:32:09.059640", "delta": "0:00:00.003443", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853529.07698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853529.07726: stderr chunk (state=3): >>><<< 24160 1726853529.07762: stdout chunk (state=3): >>><<< 24160 1726853529.07784: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:32:09.056197", "end": "2024-09-20 13:32:09.059640", "delta": "0:00:00.003443", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853529.07919: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853529.07923: _low_level_execute_command(): starting 24160 1726853529.07926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853528.7976303-24428-231482256943791/ > /dev/null 2>&1 && sleep 0' 24160 1726853529.08659: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853529.08714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.08789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.08829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853529.08864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.08910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.10825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853529.10849: stderr chunk (state=3): >>><<< 24160 1726853529.10852: stdout chunk (state=3): >>><<< 24160 1726853529.10867: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853529.11078: handler run complete 24160 1726853529.11082: Evaluated conditional (False): False 24160 1726853529.11084: attempt loop complete, returning result 24160 1726853529.11087: _execute() done 24160 1726853529.11089: dumping result to json 24160 1726853529.11091: done dumping result, returning 24160 1726853529.11093: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-5676-4eb4-00000000027c] 24160 1726853529.11095: sending task result for task 02083763-bbaf-5676-4eb4-00000000027c 24160 1726853529.11174: done sending task result for task 02083763-bbaf-5676-4eb4-00000000027c 24160 1726853529.11177: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003443", "end": "2024-09-20 13:32:09.059640", "rc": 0, "start": "2024-09-20 13:32:09.056197" } STDOUT: bonding_masters eth0 lo 24160 1726853529.11261: no more pending results, returning what we have 24160 1726853529.11265: results queue empty 24160 1726853529.11266: checking for any_errors_fatal 24160 1726853529.11268: done checking for 
any_errors_fatal 24160 1726853529.11268: checking for max_fail_percentage 24160 1726853529.11270: done checking for max_fail_percentage 24160 1726853529.11274: checking to see if all hosts have failed and the running result is not ok 24160 1726853529.11274: done checking to see if all hosts have failed 24160 1726853529.11275: getting the remaining hosts for this loop 24160 1726853529.11277: done getting the remaining hosts for this loop 24160 1726853529.11280: getting the next task for host managed_node1 24160 1726853529.11290: done getting next task for host managed_node1 24160 1726853529.11293: ^ task is: TASK: Set current_interfaces 24160 1726853529.11299: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853529.11303: getting variables 24160 1726853529.11304: in VariableManager get_vars() 24160 1726853529.11345: Calling all_inventory to load vars for managed_node1 24160 1726853529.11348: Calling groups_inventory to load vars for managed_node1 24160 1726853529.11351: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853529.11362: Calling all_plugins_play to load vars for managed_node1 24160 1726853529.11365: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853529.11368: Calling groups_plugins_play to load vars for managed_node1 24160 1726853529.11760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853529.12268: done with get_vars() 24160 1726853529.12281: done getting variables 24160 1726853529.12347: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:32:09 -0400 (0:00:00.375) 0:00:05.526 ****** 24160 1726853529.12389: entering _queue_task() for managed_node1/set_fact 24160 1726853529.12676: worker is 1 (out of 1 available) 24160 1726853529.12803: exiting _queue_task() for managed_node1/set_fact 24160 1726853529.12814: done queuing things up, now waiting for results queue to drain 24160 1726853529.12816: waiting for pending results... 
24160 1726853529.13087: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 24160 1726853529.13101: in run() - task 02083763-bbaf-5676-4eb4-00000000027d 24160 1726853529.13128: variable 'ansible_search_path' from source: unknown 24160 1726853529.13147: variable 'ansible_search_path' from source: unknown 24160 1726853529.13182: calling self._execute() 24160 1726853529.13343: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.13346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.13349: variable 'omit' from source: magic vars 24160 1726853529.13681: variable 'ansible_distribution_major_version' from source: facts 24160 1726853529.13701: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853529.13711: variable 'omit' from source: magic vars 24160 1726853529.13763: variable 'omit' from source: magic vars 24160 1726853529.13891: variable '_current_interfaces' from source: set_fact 24160 1726853529.13960: variable 'omit' from source: magic vars 24160 1726853529.14015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853529.14053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853529.14105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853529.14109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.14128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.14161: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853529.14170: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.14214: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.14293: Set connection var ansible_shell_executable to /bin/sh 24160 1726853529.14304: Set connection var ansible_pipelining to False 24160 1726853529.14311: Set connection var ansible_connection to ssh 24160 1726853529.14325: Set connection var ansible_shell_type to sh 24160 1726853529.14342: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853529.14375: Set connection var ansible_timeout to 10 24160 1726853529.14382: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.14389: variable 'ansible_connection' from source: unknown 24160 1726853529.14396: variable 'ansible_module_compression' from source: unknown 24160 1726853529.14403: variable 'ansible_shell_type' from source: unknown 24160 1726853529.14431: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.14434: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.14436: variable 'ansible_pipelining' from source: unknown 24160 1726853529.14438: variable 'ansible_timeout' from source: unknown 24160 1726853529.14443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.14597: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853529.14649: variable 'omit' from source: magic vars 24160 1726853529.14653: starting attempt loop 24160 1726853529.14656: running the handler 24160 1726853529.14658: handler run complete 24160 1726853529.14662: attempt loop complete, returning result 24160 1726853529.14670: _execute() done 24160 1726853529.14759: dumping result to json 24160 1726853529.14762: done dumping result, returning 24160 
1726853529.14765: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-5676-4eb4-00000000027d] 24160 1726853529.14768: sending task result for task 02083763-bbaf-5676-4eb4-00000000027d 24160 1726853529.14832: done sending task result for task 02083763-bbaf-5676-4eb4-00000000027d 24160 1726853529.14836: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24160 1726853529.14903: no more pending results, returning what we have 24160 1726853529.14907: results queue empty 24160 1726853529.14908: checking for any_errors_fatal 24160 1726853529.14915: done checking for any_errors_fatal 24160 1726853529.14916: checking for max_fail_percentage 24160 1726853529.14918: done checking for max_fail_percentage 24160 1726853529.14920: checking to see if all hosts have failed and the running result is not ok 24160 1726853529.14921: done checking to see if all hosts have failed 24160 1726853529.14922: getting the remaining hosts for this loop 24160 1726853529.14923: done getting the remaining hosts for this loop 24160 1726853529.14927: getting the next task for host managed_node1 24160 1726853529.14936: done getting next task for host managed_node1 24160 1726853529.14939: ^ task is: TASK: Show current_interfaces 24160 1726853529.14943: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853529.14948: getting variables 24160 1726853529.14949: in VariableManager get_vars() 24160 1726853529.14988: Calling all_inventory to load vars for managed_node1 24160 1726853529.14991: Calling groups_inventory to load vars for managed_node1 24160 1726853529.14993: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853529.15003: Calling all_plugins_play to load vars for managed_node1 24160 1726853529.15006: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853529.15009: Calling groups_plugins_play to load vars for managed_node1 24160 1726853529.15417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853529.15601: done with get_vars() 24160 1726853529.15621: done getting variables 24160 1726853529.15681: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:32:09 -0400 (0:00:00.033) 0:00:05.559 ****** 24160 1726853529.15712: entering _queue_task() for managed_node1/debug 24160 1726853529.16019: worker is 1 (out of 1 available) 24160 1726853529.16032: exiting _queue_task() for managed_node1/debug 24160 1726853529.16162: done queuing things up, now waiting for results queue to drain 24160 1726853529.16164: waiting for pending results... 
24160 1726853529.16395: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 24160 1726853529.16419: in run() - task 02083763-bbaf-5676-4eb4-000000000246 24160 1726853529.16478: variable 'ansible_search_path' from source: unknown 24160 1726853529.16489: variable 'ansible_search_path' from source: unknown 24160 1726853529.16499: calling self._execute() 24160 1726853529.16584: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.16604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.16622: variable 'omit' from source: magic vars 24160 1726853529.17035: variable 'ansible_distribution_major_version' from source: facts 24160 1726853529.17039: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853529.17041: variable 'omit' from source: magic vars 24160 1726853529.17094: variable 'omit' from source: magic vars 24160 1726853529.17203: variable 'current_interfaces' from source: set_fact 24160 1726853529.17253: variable 'omit' from source: magic vars 24160 1726853529.17293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853529.17362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853529.17365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853529.17389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.17406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.17440: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853529.17470: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.17475: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.17591: Set connection var ansible_shell_executable to /bin/sh 24160 1726853529.17598: Set connection var ansible_pipelining to False 24160 1726853529.17602: Set connection var ansible_connection to ssh 24160 1726853529.17677: Set connection var ansible_shell_type to sh 24160 1726853529.17688: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853529.17691: Set connection var ansible_timeout to 10 24160 1726853529.17693: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.17696: variable 'ansible_connection' from source: unknown 24160 1726853529.17698: variable 'ansible_module_compression' from source: unknown 24160 1726853529.17700: variable 'ansible_shell_type' from source: unknown 24160 1726853529.17703: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.17705: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.17707: variable 'ansible_pipelining' from source: unknown 24160 1726853529.17709: variable 'ansible_timeout' from source: unknown 24160 1726853529.17711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.17860: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853529.17879: variable 'omit' from source: magic vars 24160 1726853529.17905: starting attempt loop 24160 1726853529.17908: running the handler 24160 1726853529.17954: handler run complete 24160 1726853529.18014: attempt loop complete, returning result 24160 1726853529.18017: _execute() done 24160 1726853529.18019: dumping result to json 24160 1726853529.18021: done dumping result, returning 24160 1726853529.18023: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-5676-4eb4-000000000246] 24160 1726853529.18025: sending task result for task 02083763-bbaf-5676-4eb4-000000000246 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24160 1726853529.18257: no more pending results, returning what we have 24160 1726853529.18261: results queue empty 24160 1726853529.18263: checking for any_errors_fatal 24160 1726853529.18267: done checking for any_errors_fatal 24160 1726853529.18268: checking for max_fail_percentage 24160 1726853529.18270: done checking for max_fail_percentage 24160 1726853529.18272: checking to see if all hosts have failed and the running result is not ok 24160 1726853529.18273: done checking to see if all hosts have failed 24160 1726853529.18274: getting the remaining hosts for this loop 24160 1726853529.18275: done getting the remaining hosts for this loop 24160 1726853529.18336: getting the next task for host managed_node1 24160 1726853529.18345: done getting next task for host managed_node1 24160 1726853529.18348: ^ task is: TASK: Install iproute 24160 1726853529.18351: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853529.18356: getting variables 24160 1726853529.18358: in VariableManager get_vars() 24160 1726853529.18434: Calling all_inventory to load vars for managed_node1 24160 1726853529.18436: Calling groups_inventory to load vars for managed_node1 24160 1726853529.18439: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853529.18448: Calling all_plugins_play to load vars for managed_node1 24160 1726853529.18450: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853529.18456: Calling groups_plugins_play to load vars for managed_node1 24160 1726853529.18618: done sending task result for task 02083763-bbaf-5676-4eb4-000000000246 24160 1726853529.18622: WORKER PROCESS EXITING 24160 1726853529.18634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853529.18753: done with get_vars() 24160 1726853529.18763: done getting variables 24160 1726853529.18811: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:32:09 -0400 (0:00:00.031) 0:00:05.591 ****** 24160 1726853529.18834: entering _queue_task() for managed_node1/package 24160 1726853529.19050: worker is 1 (out of 1 available) 24160 1726853529.19066: exiting _queue_task() for managed_node1/package 24160 1726853529.19080: done queuing things up, now waiting for results queue to drain 24160 1726853529.19082: waiting for pending results... 
24160 1726853529.19231: running TaskExecutor() for managed_node1/TASK: Install iproute 24160 1726853529.19289: in run() - task 02083763-bbaf-5676-4eb4-0000000001b1 24160 1726853529.19302: variable 'ansible_search_path' from source: unknown 24160 1726853529.19306: variable 'ansible_search_path' from source: unknown 24160 1726853529.19336: calling self._execute() 24160 1726853529.19397: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.19401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.19410: variable 'omit' from source: magic vars 24160 1726853529.19685: variable 'ansible_distribution_major_version' from source: facts 24160 1726853529.19696: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853529.19702: variable 'omit' from source: magic vars 24160 1726853529.19726: variable 'omit' from source: magic vars 24160 1726853529.19861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853529.21578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853529.21582: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853529.21585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853529.21587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853529.21588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853529.21634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853529.21665: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853529.21698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853529.21742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853529.21762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853529.21873: variable '__network_is_ostree' from source: set_fact 24160 1726853529.21884: variable 'omit' from source: magic vars 24160 1726853529.21915: variable 'omit' from source: magic vars 24160 1726853529.21946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853529.21982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853529.22010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853529.22030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.22038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.22095: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853529.22098: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.22100: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24160 1726853529.22161: Set connection var ansible_shell_executable to /bin/sh 24160 1726853529.22165: Set connection var ansible_pipelining to False 24160 1726853529.22167: Set connection var ansible_connection to ssh 24160 1726853529.22170: Set connection var ansible_shell_type to sh 24160 1726853529.22177: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853529.22265: Set connection var ansible_timeout to 10 24160 1726853529.22269: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.22290: variable 'ansible_connection' from source: unknown 24160 1726853529.22296: variable 'ansible_module_compression' from source: unknown 24160 1726853529.22299: variable 'ansible_shell_type' from source: unknown 24160 1726853529.22302: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.22304: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.22306: variable 'ansible_pipelining' from source: unknown 24160 1726853529.22308: variable 'ansible_timeout' from source: unknown 24160 1726853529.22310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.22313: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853529.22316: variable 'omit' from source: magic vars 24160 1726853529.22318: starting attempt loop 24160 1726853529.22320: running the handler 24160 1726853529.22322: variable 'ansible_facts' from source: unknown 24160 1726853529.22325: variable 'ansible_facts' from source: unknown 24160 1726853529.22342: _low_level_execute_command(): starting 24160 1726853529.22348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 
1726853529.22848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853529.22851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.22855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853529.22858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.22913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.22921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853529.22923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.22969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.24669: stdout chunk (state=3): >>>/root <<< 24160 1726853529.24770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853529.24811: stderr chunk (state=3): >>><<< 24160 1726853529.24813: stdout chunk (state=3): >>><<< 24160 1726853529.24828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853529.24842: _low_level_execute_command(): starting 24160 1726853529.24846: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429 `" && echo ansible-tmp-1726853529.2482882-24465-55531906911429="` echo /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429 `" ) && sleep 0' 24160 1726853529.25438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853529.25441: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.25482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.25526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.27464: stdout chunk (state=3): >>>ansible-tmp-1726853529.2482882-24465-55531906911429=/root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429 <<< 24160 1726853529.27566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853529.27595: stderr chunk (state=3): >>><<< 24160 1726853529.27598: stdout chunk (state=3): >>><<< 24160 1726853529.27613: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853529.2482882-24465-55531906911429=/root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853529.27639: variable 'ansible_module_compression' from source: unknown 24160 1726853529.27691: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 24160 1726853529.27695: ANSIBALLZ: Acquiring lock 24160 1726853529.27697: ANSIBALLZ: Lock acquired: 140302803944608 24160 1726853529.27699: ANSIBALLZ: Creating module 24160 1726853529.38496: ANSIBALLZ: Writing module into payload 24160 1726853529.38635: ANSIBALLZ: Writing module 24160 1726853529.38656: ANSIBALLZ: Renaming module 24160 1726853529.38665: ANSIBALLZ: Done creating module 24160 1726853529.38684: variable 'ansible_facts' from source: unknown 24160 1726853529.38741: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py 24160 1726853529.38841: Sending initial data 24160 1726853529.38844: Sent initial data (151 bytes) 24160 1726853529.39312: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.39315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853529.39317: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853529.39319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853529.39322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.39375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.39379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.39438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.41088: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 24160 1726853529.41092: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853529.41118: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853529.41153: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpqzsawczu /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py <<< 24160 1726853529.41162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py" <<< 24160 1726853529.41202: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpqzsawczu" to remote "/root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py" <<< 24160 1726853529.41916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853529.41982: stderr chunk (state=3): >>><<< 24160 1726853529.41996: stdout chunk (state=3): >>><<< 24160 1726853529.42021: done transferring module to remote 24160 1726853529.42035: _low_level_execute_command(): starting 24160 1726853529.42120: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/ /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py && sleep 0' 24160 1726853529.42684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853529.42704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.42725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853529.42793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.42850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.42866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853529.42895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.42958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.44735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853529.44742: stdout chunk (state=3): >>><<< 24160 1726853529.44750: stderr chunk (state=3): >>><<< 24160 1726853529.44772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853529.44780: _low_level_execute_command(): starting 24160 1726853529.44785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/AnsiballZ_dnf.py && sleep 0' 24160 1726853529.45223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.45226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.45229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.45232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853529.45234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.45276: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.45296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.45333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.86291: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 24160 1726853529.90562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853529.90566: stdout chunk (state=3): >>><<< 24160 1726853529.90569: stderr chunk (state=3): >>><<< 24160 1726853529.90574: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853529.90581: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853529.90584: _low_level_execute_command(): starting 24160 1726853529.90586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853529.2482882-24465-55531906911429/ > /dev/null 2>&1 && sleep 0' 24160 1726853529.91147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853529.91164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.91185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853529.91204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853529.91222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853529.91266: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.91331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.91348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853529.91387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.91452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853529.93342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853529.93346: stdout chunk (state=3): >>><<< 24160 1726853529.93349: stderr chunk (state=3): >>><<< 24160 1726853529.93366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853529.93482: handler run complete 24160 1726853529.93559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853529.93740: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853529.93785: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853529.93820: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853529.93854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853529.93926: variable '__install_status' from source: unknown 24160 1726853529.93950: Evaluated conditional (__install_status is success): True 24160 1726853529.93972: attempt loop complete, returning result 24160 1726853529.93981: _execute() done 24160 1726853529.93988: dumping result to json 24160 1726853529.93998: done dumping result, returning 24160 1726853529.94008: done running TaskExecutor() for managed_node1/TASK: Install iproute [02083763-bbaf-5676-4eb4-0000000001b1] 24160 1726853529.94016: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b1 ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 24160 1726853529.94211: no more pending results, returning what we have 24160 1726853529.94215: results queue empty 24160 1726853529.94216: checking for any_errors_fatal 24160 1726853529.94219: done checking for any_errors_fatal 24160 1726853529.94220: checking for 
max_fail_percentage 24160 1726853529.94222: done checking for max_fail_percentage 24160 1726853529.94222: checking to see if all hosts have failed and the running result is not ok 24160 1726853529.94223: done checking to see if all hosts have failed 24160 1726853529.94224: getting the remaining hosts for this loop 24160 1726853529.94225: done getting the remaining hosts for this loop 24160 1726853529.94228: getting the next task for host managed_node1 24160 1726853529.94236: done getting next task for host managed_node1 24160 1726853529.94239: ^ task is: TASK: Create veth interface {{ interface }} 24160 1726853529.94241: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853529.94245: getting variables 24160 1726853529.94246: in VariableManager get_vars() 24160 1726853529.94284: Calling all_inventory to load vars for managed_node1 24160 1726853529.94287: Calling groups_inventory to load vars for managed_node1 24160 1726853529.94289: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853529.94297: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b1 24160 1726853529.94300: WORKER PROCESS EXITING 24160 1726853529.94387: Calling all_plugins_play to load vars for managed_node1 24160 1726853529.94391: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853529.94396: Calling groups_plugins_play to load vars for managed_node1 24160 1726853529.94742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853529.94942: done with get_vars() 24160 1726853529.94954: done getting variables 24160 1726853529.95015: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853529.95134: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:32:09 -0400 (0:00:00.763) 0:00:06.354 ****** 24160 1726853529.95163: entering _queue_task() for managed_node1/command 24160 1726853529.95415: worker is 1 (out of 1 available) 24160 1726853529.95428: exiting _queue_task() for managed_node1/command 24160 1726853529.95440: done queuing things up, now waiting for results queue to drain 24160 1726853529.95441: waiting for pending results... 
24160 1726853529.95689: running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 24160 1726853529.95789: in run() - task 02083763-bbaf-5676-4eb4-0000000001b2 24160 1726853529.95808: variable 'ansible_search_path' from source: unknown 24160 1726853529.95815: variable 'ansible_search_path' from source: unknown 24160 1726853529.96065: variable 'interface' from source: set_fact 24160 1726853529.96154: variable 'interface' from source: set_fact 24160 1726853529.96234: variable 'interface' from source: set_fact 24160 1726853529.96377: Loaded config def from plugin (lookup/items) 24160 1726853529.96389: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 24160 1726853529.96417: variable 'omit' from source: magic vars 24160 1726853529.96536: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.96557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.96575: variable 'omit' from source: magic vars 24160 1726853529.96847: variable 'ansible_distribution_major_version' from source: facts 24160 1726853529.96859: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853529.97047: variable 'type' from source: set_fact 24160 1726853529.97056: variable 'state' from source: include params 24160 1726853529.97063: variable 'interface' from source: set_fact 24160 1726853529.97070: variable 'current_interfaces' from source: set_fact 24160 1726853529.97083: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24160 1726853529.97095: variable 'omit' from source: magic vars 24160 1726853529.97132: variable 'omit' from source: magic vars 24160 1726853529.97178: variable 'item' from source: unknown 24160 1726853529.97246: variable 'item' from source: unknown 24160 1726853529.97266: variable 'omit' from source: magic vars 24160 1726853529.97306: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853529.97476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853529.97479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853529.97482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.97484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853529.97486: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853529.97488: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.97490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.97537: Set connection var ansible_shell_executable to /bin/sh 24160 1726853529.97549: Set connection var ansible_pipelining to False 24160 1726853529.97555: Set connection var ansible_connection to ssh 24160 1726853529.97562: Set connection var ansible_shell_type to sh 24160 1726853529.97575: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853529.97588: Set connection var ansible_timeout to 10 24160 1726853529.97613: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.97621: variable 'ansible_connection' from source: unknown 24160 1726853529.97627: variable 'ansible_module_compression' from source: unknown 24160 1726853529.97633: variable 'ansible_shell_type' from source: unknown 24160 1726853529.97639: variable 'ansible_shell_executable' from source: unknown 24160 1726853529.97645: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853529.97653: variable 'ansible_pipelining' from source: unknown 24160 1726853529.97659: variable 'ansible_timeout' from 
source: unknown 24160 1726853529.97666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853529.97800: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853529.97817: variable 'omit' from source: magic vars 24160 1726853529.97831: starting attempt loop 24160 1726853529.97838: running the handler 24160 1726853529.97855: _low_level_execute_command(): starting 24160 1726853529.97868: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853529.98565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853529.98586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853529.98599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853529.98696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853529.98721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 24160 1726853529.98738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853529.98762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853529.98838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.00537: stdout chunk (state=3): >>>/root <<< 24160 1726853530.00696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.00699: stdout chunk (state=3): >>><<< 24160 1726853530.00703: stderr chunk (state=3): >>><<< 24160 1726853530.00913: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.00923: _low_level_execute_command(): starting 24160 1726853530.00925: _low_level_execute_command(): executing: 
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991 `" && echo ansible-tmp-1726853530.0074944-24487-45429292740991="` echo /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991 `" ) && sleep 0' 24160 1726853530.01564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.01568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853530.01570: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.01575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.01577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853530.01580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.01632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.01640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.01690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.03592: stdout chunk (state=3): 
>>>ansible-tmp-1726853530.0074944-24487-45429292740991=/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991 <<< 24160 1726853530.03851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.03857: stdout chunk (state=3): >>><<< 24160 1726853530.03860: stderr chunk (state=3): >>><<< 24160 1726853530.04004: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853530.0074944-24487-45429292740991=/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.04008: variable 'ansible_module_compression' from source: unknown 24160 1726853530.04010: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853530.04099: variable 
'ansible_facts' from source: unknown 24160 1726853530.04310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py 24160 1726853530.04692: Sending initial data 24160 1726853530.04791: Sent initial data (155 bytes) 24160 1726853530.06191: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.06267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.06299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.06328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.06427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.07981: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853530.08033: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853530.08338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp5k_a__tc /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py <<< 24160 1726853530.08342: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py" <<< 24160 1726853530.08424: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp5k_a__tc" to remote "/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py" <<< 24160 1726853530.09792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.09801: stdout chunk (state=3): >>><<< 24160 1726853530.09824: stderr chunk (state=3): >>><<< 24160 1726853530.09885: done transferring module to remote 24160 1726853530.09900: _low_level_execute_command(): starting 24160 1726853530.09909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/ 
/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py && sleep 0' 24160 1726853530.10590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.10593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853530.10596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.10598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.10600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.10665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.10701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.12729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.12733: stdout chunk (state=3): >>><<< 24160 1726853530.12735: stderr chunk (state=3): >>><<< 24160 1726853530.12737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.12739: _low_level_execute_command(): starting 24160 1726853530.12741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/AnsiballZ_command.py && sleep 0' 24160 1726853530.13667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853530.13976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.13992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.14009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.14086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.30185: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 13:32:10.294462", "end": "2024-09-20 13:32:10.299431", "delta": "0:00:00.004969", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853530.32264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.32285: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853530.32350: stderr chunk (state=3): >>><<< 24160 1726853530.32362: stdout chunk (state=3): >>><<< 24160 1726853530.32389: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 13:32:10.294462", "end": "2024-09-20 13:32:10.299431", "delta": "0:00:00.004969", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
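The module result above is plain JSON printed on the remote command's stdout. A minimal Python sketch, using the exact payload fields from this run, shows how the pieces fit together, including that `delta` is simply `end - start`:

```python
import json
from datetime import datetime

# Result payload returned by the command module in the run above
# (whitespace reflowed; field values are verbatim from the log).
payload = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"],
 "start": "2024-09-20 13:32:10.294462", "end": "2024-09-20 13:32:10.299431",
 "delta": "0:00:00.004969", "msg": ""}'''

result = json.loads(payload)
assert result["rc"] == 0 and result["changed"]

# 'delta' is the wall-clock duration of the command: end minus start.
fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
print(elapsed)                    # 0:00:00.004969, matching the reported delta
print(" ".join(result["cmd"]))    # ip link add ethtest0 type veth peer name peerethtest0
```

Note that `_uses_shell` is false in the invocation, so `cmd` is an argv list executed directly rather than a shell string, which is why it round-trips as a JSON array here.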
24160 1726853530.32432: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853530.32460: _low_level_execute_command(): starting 24160 1726853530.32474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853530.0074944-24487-45429292740991/ > /dev/null 2>&1 && sleep 0' 24160 1726853530.33101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.33104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853530.33106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.33108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.33117: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.33172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.33176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.33259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.38303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.38336: stderr chunk (state=3): >>><<< 24160 1726853530.38389: stdout chunk (state=3): >>><<< 24160 1726853530.38393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.38497: handler run complete 24160 1726853530.38500: Evaluated conditional (False): False 24160 1726853530.38504: attempt loop complete, returning result 24160 1726853530.38507: variable 'item' from source: unknown 24160 1726853530.38691: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.004969", "end": "2024-09-20 13:32:10.299431", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 13:32:10.294462" } 24160 1726853530.39148: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853530.39151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853530.39155: variable 'omit' from source: magic vars 24160 1726853530.39236: variable 'ansible_distribution_major_version' from source: facts 24160 1726853530.39246: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853530.39780: variable 'type' from source: set_fact 24160 1726853530.39785: variable 'state' from source: include params 24160 1726853530.39787: variable 'interface' from source: set_fact 24160 1726853530.39789: variable 'current_interfaces' from source: set_fact 24160 1726853530.39792: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24160 1726853530.39794: variable 'omit' from source: magic vars 24160 1726853530.39796: variable 'omit' from source: magic vars 24160 1726853530.39798: variable 'item' from source: unknown 24160 1726853530.39887: variable 'item' from source: unknown 24160 1726853530.39914: variable 'omit' from source: magic vars 24160 1726853530.39948: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853530.39977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853530.39980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853530.40022: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853530.40025: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853530.40028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853530.40117: Set connection var ansible_shell_executable to /bin/sh 24160 1726853530.40147: Set connection var ansible_pipelining to False 24160 1726853530.40150: Set connection var ansible_connection to ssh 24160 1726853530.40152: Set connection var ansible_shell_type to sh 24160 1726853530.40219: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853530.40222: Set connection var ansible_timeout to 10 24160 1726853530.40224: variable 'ansible_shell_executable' from source: unknown 24160 1726853530.40226: variable 'ansible_connection' from source: unknown 24160 1726853530.40228: variable 'ansible_module_compression' from source: unknown 24160 1726853530.40231: variable 'ansible_shell_type' from source: unknown 24160 1726853530.40237: variable 'ansible_shell_executable' from source: unknown 24160 1726853530.40239: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853530.40241: variable 'ansible_pipelining' from source: unknown 24160 1726853530.40243: variable 'ansible_timeout' from source: unknown 24160 1726853530.40245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853530.40366: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853530.40384: variable 'omit' from source: magic vars 24160 1726853530.40393: starting attempt loop 24160 1726853530.40400: running the handler 24160 1726853530.40437: _low_level_execute_command(): starting 24160 1726853530.40440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853530.41186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853530.41227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853530.41493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.41527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.41546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.41618: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24160 1726853530.43283: stdout chunk (state=3): >>>/root <<< 24160 1726853530.43629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.43632: stdout chunk (state=3): >>><<< 24160 1726853530.43634: stderr chunk (state=3): >>><<< 24160 1726853530.43637: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.43639: _low_level_execute_command(): starting 24160 1726853530.43641: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365 `" && echo ansible-tmp-1726853530.4355125-24487-173598202895365="` echo 
/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365 `" ) && sleep 0' 24160 1726853530.44549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853530.44565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.44578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.44628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.44648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.44699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.44730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.46712: stdout chunk (state=3): >>>ansible-tmp-1726853530.4355125-24487-173598202895365=/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365 <<< 24160 1726853530.46783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.46796: stderr chunk (state=3): >>><<< 24160 1726853530.46808: stdout chunk (state=3): >>><<< 24160 
1726853530.46976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853530.4355125-24487-173598202895365=/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.46979: variable 'ansible_module_compression' from source: unknown 24160 1726853530.46981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853530.46983: variable 'ansible_facts' from source: unknown 24160 1726853530.46995: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py 24160 1726853530.47198: Sending initial data 24160 1726853530.47201: Sent initial data (156 bytes) 24160 1726853530.47819: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.47843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.47914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.49459: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853530.49687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853530.49716: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpredoo8hm /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py <<< 24160 1726853530.49720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py" <<< 24160 1726853530.49843: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpredoo8hm" to remote "/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py" <<< 24160 1726853530.50634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.50688: stderr chunk (state=3): >>><<< 24160 1726853530.50691: stdout chunk (state=3): >>><<< 24160 1726853530.50710: done transferring module to remote 24160 1726853530.50719: _low_level_execute_command(): starting 24160 1726853530.50723: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/ /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py && sleep 0' 24160 1726853530.51302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853530.51312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.51320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.51344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
24160 1726853530.51385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.51444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.51459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.51479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.51539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.53334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.53338: stdout chunk (state=3): >>><<< 24160 1726853530.53346: stderr chunk (state=3): >>><<< 24160 1726853530.53359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.53362: _low_level_execute_command(): starting 24160 1726853530.53373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/AnsiballZ_command.py && sleep 0' 24160 1726853530.53802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.53805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.53808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.53810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853530.53812: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.53859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.53863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.53913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.69565: stdout chunk (state=3): >>> <<< 24160 1726853530.69570: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 13:32:10.690887", "end": "2024-09-20 13:32:10.694755", "delta": "0:00:00.003868", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853530.71186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853530.71194: stdout chunk (state=3): >>><<< 24160 1726853530.71197: stderr chunk (state=3): >>><<< 24160 1726853530.71208: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 13:32:10.690887", "end": "2024-09-20 13:32:10.694755", "delta": "0:00:00.003868", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853530.71240: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853530.71276: _low_level_execute_command(): starting 24160 1726853530.71280: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853530.4355125-24487-173598202895365/ > /dev/null 2>&1 && sleep 0' 24160 1726853530.72110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853530.72121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.72137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.72186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853530.72190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853530.72192: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853530.72195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.72197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853530.72251: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.72286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.72336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.72340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.72388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.74377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.74402: stdout chunk (state=3): >>><<< 24160 1726853530.74406: stderr chunk (state=3): >>><<< 24160 1726853530.74408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.74410: handler run complete 24160 1726853530.74412: Evaluated conditional (False): False 24160 1726853530.74416: attempt loop complete, returning result 24160 1726853530.74418: variable 'item' from source: unknown 24160 1726853530.74460: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003868", "end": "2024-09-20 13:32:10.694755", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 13:32:10.690887" } 24160 1726853530.74696: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853530.74700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853530.74702: variable 'omit' from source: magic vars 24160 1726853530.74779: variable 'ansible_distribution_major_version' from source: facts 24160 1726853530.74784: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853530.75002: variable 'type' from source: set_fact 24160 1726853530.75005: variable 'state' from source: include params 24160 1726853530.75008: variable 'interface' from source: set_fact 24160 1726853530.75012: variable 'current_interfaces' from source: set_fact 24160 1726853530.75018: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24160 1726853530.75021: variable 'omit' from source: magic vars 24160 1726853530.75036: variable 'omit' from source: magic vars 24160 1726853530.75065: variable 'item' from 
source: unknown 24160 1726853530.75109: variable 'item' from source: unknown 24160 1726853530.75120: variable 'omit' from source: magic vars 24160 1726853530.75139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853530.75146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853530.75149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853530.75164: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853530.75167: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853530.75169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853530.75224: Set connection var ansible_shell_executable to /bin/sh 24160 1726853530.75228: Set connection var ansible_pipelining to False 24160 1726853530.75230: Set connection var ansible_connection to ssh 24160 1726853530.75233: Set connection var ansible_shell_type to sh 24160 1726853530.75239: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853530.75246: Set connection var ansible_timeout to 10 24160 1726853530.75265: variable 'ansible_shell_executable' from source: unknown 24160 1726853530.75268: variable 'ansible_connection' from source: unknown 24160 1726853530.75272: variable 'ansible_module_compression' from source: unknown 24160 1726853530.75275: variable 'ansible_shell_type' from source: unknown 24160 1726853530.75277: variable 'ansible_shell_executable' from source: unknown 24160 1726853530.75280: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853530.75283: variable 'ansible_pipelining' from source: unknown 24160 1726853530.75285: variable 'ansible_timeout' from source: unknown 24160 
1726853530.75290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853530.75349: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853530.75355: variable 'omit' from source: magic vars 24160 1726853530.75364: starting attempt loop 24160 1726853530.75367: running the handler 24160 1726853530.75373: _low_level_execute_command(): starting 24160 1726853530.75375: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853530.75775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.75791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853530.75796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.75811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.75863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 
24160 1726853530.75869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.75908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.77494: stdout chunk (state=3): >>>/root <<< 24160 1726853530.77592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.77621: stderr chunk (state=3): >>><<< 24160 1726853530.77623: stdout chunk (state=3): >>><<< 24160 1726853530.77633: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.77676: _low_level_execute_command(): starting 24160 1726853530.77680: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845 `" && echo ansible-tmp-1726853530.7763586-24487-49420825444845="` echo /root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845 `" ) && sleep 0' 24160 1726853530.78036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853530.78070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853530.78076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853530.78078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.78081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.78083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.78132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.78135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.78174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.80124: stdout chunk (state=3): 
>>>ansible-tmp-1726853530.7763586-24487-49420825444845=/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845 <<< 24160 1726853530.80259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.80288: stderr chunk (state=3): >>><<< 24160 1726853530.80291: stdout chunk (state=3): >>><<< 24160 1726853530.80380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853530.7763586-24487-49420825444845=/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.80384: variable 'ansible_module_compression' from source: unknown 24160 1726853530.80386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853530.80397: variable 
'ansible_facts' from source: unknown 24160 1726853530.80478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py 24160 1726853530.80611: Sending initial data 24160 1726853530.80619: Sent initial data (155 bytes) 24160 1726853530.81224: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853530.81259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.81278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853530.81369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.81384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.81449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.82968: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24160 1726853530.82993: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 24160 1726853530.83006: stderr chunk (state=3): 
>>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853530.83090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853530.83167: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpwn225bou /root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py <<< 24160 1726853530.83172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py" <<< 24160 1726853530.83207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpwn225bou" to remote "/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py" <<< 24160 1726853530.84032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.84035: stdout chunk (state=3): >>><<< 24160 1726853530.84041: stderr chunk (state=3): >>><<< 24160 1726853530.84106: done transferring module to remote 24160 1726853530.84137: _low_level_execute_command(): starting 24160 1726853530.84140: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/ /root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py && sleep 0' 24160 1726853530.84790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853530.84803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853530.84818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853530.84887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.84917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.84934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.84956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.85029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853530.86831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853530.86835: stdout chunk (state=3): >>><<< 24160 1726853530.86837: stderr chunk (state=3): >>><<< 24160 1726853530.86854: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853530.86953: _low_level_execute_command(): starting 24160 1726853530.86956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/AnsiballZ_command.py && sleep 0' 24160 1726853530.87514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853530.87517: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853530.87707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853530.87710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853530.87787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853530.87854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.03320: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 13:32:11.027604", "end": "2024-09-20 13:32:11.031395", "delta": "0:00:00.003791", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853531.04824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.04842: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853531.04850: stdout chunk (state=3): >>><<< 24160 1726853531.04857: stderr chunk (state=3): >>><<< 24160 1726853531.04878: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 13:32:11.027604", "end": "2024-09-20 13:32:11.031395", "delta": "0:00:00.003791", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
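The stdout chunk captured above is the raw JSON result emitted by the AnsiballZ-wrapped `command` module before Ansible post-processes it into the task result. A minimal sketch of pulling the interesting fields out of such a payload (the literal below is copied verbatim from the log; the field names are the `command` module's standard return values):

```python
import json

# Module result JSON exactly as it appears in the stdout chunk above
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "set", "ethtest0", "up"], '
    '"start": "2024-09-20 13:32:11.027604", "end": "2024-09-20 13:32:11.031395", '
    '"delta": "0:00:00.003791", "msg": "", '
    '"invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", '
    '"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, '
    '"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, '
    '"creates": null, "removes": null, "stdin": null}}}'
)

result = json.loads(raw)

# rc == 0 together with changed == true is what lets the executor report
# the "Create veth interface ethtest0" loop item as ok/changed further down.
print(result["rc"], result["changed"], " ".join(result["cmd"]))
```

Note that `_raw_params` inside `invocation.module_args` holds the original command string, while `cmd` is the argv list the module actually executed (`_uses_shell` is false, so no shell was involved).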
24160 1726853531.04912: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853531.04921: _low_level_execute_command(): starting 24160 1726853531.04928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853530.7763586-24487-49420825444845/ > /dev/null 2>&1 && sleep 0' 24160 1726853531.05462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853531.05481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.05494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.05508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853531.05521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853531.05529: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853531.05539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.05555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853531.05565: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853531.05649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24160 1726853531.05652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.05673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.05687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.05745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.07573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.07595: stderr chunk (state=3): >>><<< 24160 1726853531.07605: stdout chunk (state=3): >>><<< 24160 1726853531.07677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.07684: handler run complete 24160 1726853531.07686: Evaluated conditional (False): False 24160 1726853531.07688: attempt loop complete, returning result 24160 1726853531.07690: variable 'item' from source: unknown 24160 1726853531.07720: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003791", "end": "2024-09-20 13:32:11.031395", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 13:32:11.027604" } 24160 1726853531.07835: dumping result to json 24160 1726853531.07838: done dumping result, returning 24160 1726853531.07840: done running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 [02083763-bbaf-5676-4eb4-0000000001b2] 24160 1726853531.07842: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b2 24160 1726853531.07945: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b2 24160 1726853531.07948: WORKER PROCESS EXITING 24160 1726853531.08010: no more pending results, returning what we have 24160 1726853531.08014: results queue empty 24160 1726853531.08015: checking for any_errors_fatal 24160 1726853531.08018: done checking for any_errors_fatal 24160 1726853531.08019: checking for max_fail_percentage 24160 1726853531.08020: done checking for max_fail_percentage 24160 1726853531.08021: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.08022: done checking to see if all hosts 
have failed 24160 1726853531.08023: getting the remaining hosts for this loop 24160 1726853531.08024: done getting the remaining hosts for this loop 24160 1726853531.08026: getting the next task for host managed_node1 24160 1726853531.08031: done getting next task for host managed_node1 24160 1726853531.08033: ^ task is: TASK: Set up veth as managed by NetworkManager 24160 1726853531.08036: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853531.08039: getting variables 24160 1726853531.08040: in VariableManager get_vars() 24160 1726853531.08074: Calling all_inventory to load vars for managed_node1 24160 1726853531.08077: Calling groups_inventory to load vars for managed_node1 24160 1726853531.08079: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.08088: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.08090: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.08093: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.08207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.08321: done with get_vars() 24160 1726853531.08328: done getting variables 24160 1726853531.08370: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:32:11 -0400 (0:00:01.132) 0:00:07.486 ****** 24160 1726853531.08393: entering _queue_task() for managed_node1/command 24160 1726853531.08596: worker is 1 (out of 1 available) 24160 1726853531.08608: exiting _queue_task() for managed_node1/command 24160 1726853531.08620: done queuing things up, now waiting for results queue to drain 24160 1726853531.08621: waiting for pending results... 24160 1726853531.08774: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 24160 1726853531.08830: in run() - task 02083763-bbaf-5676-4eb4-0000000001b3 24160 1726853531.08843: variable 'ansible_search_path' from source: unknown 24160 1726853531.08848: variable 'ansible_search_path' from source: unknown 24160 1726853531.08879: calling self._execute() 24160 1726853531.08937: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.08941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.08950: variable 'omit' from source: magic vars 24160 1726853531.09378: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.09382: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.09412: variable 'type' from source: set_fact 24160 1726853531.09421: variable 'state' from source: include params 24160 1726853531.09428: Evaluated conditional (type == 'veth' and state == 'present'): True 24160 1726853531.09439: variable 'omit' from source: magic vars 24160 1726853531.09477: variable 'omit' from source: magic vars 24160 
1726853531.09578: variable 'interface' from source: set_fact 24160 1726853531.09603: variable 'omit' from source: magic vars 24160 1726853531.09644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853531.09686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853531.09712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853531.09734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.09751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.09787: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853531.09796: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.09804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.10176: Set connection var ansible_shell_executable to /bin/sh 24160 1726853531.10179: Set connection var ansible_pipelining to False 24160 1726853531.10182: Set connection var ansible_connection to ssh 24160 1726853531.10184: Set connection var ansible_shell_type to sh 24160 1726853531.10186: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853531.10189: Set connection var ansible_timeout to 10 24160 1726853531.10191: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.10192: variable 'ansible_connection' from source: unknown 24160 1726853531.10194: variable 'ansible_module_compression' from source: unknown 24160 1726853531.10196: variable 'ansible_shell_type' from source: unknown 24160 1726853531.10198: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.10200: variable 'ansible_host' from source: host 
vars for 'managed_node1' 24160 1726853531.10202: variable 'ansible_pipelining' from source: unknown 24160 1726853531.10204: variable 'ansible_timeout' from source: unknown 24160 1726853531.10206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.10283: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853531.10299: variable 'omit' from source: magic vars 24160 1726853531.10309: starting attempt loop 24160 1726853531.10317: running the handler 24160 1726853531.10335: _low_level_execute_command(): starting 24160 1726853531.10347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853531.11398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.11433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853531.11447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853531.11462: stderr chunk (state=3): >>>debug2: match found <<< 24160 1726853531.11510: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.11583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.11601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.11625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.11709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.13361: stdout chunk (state=3): >>>/root <<< 24160 1726853531.13505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.13526: stdout chunk (state=3): >>><<< 24160 1726853531.13559: stderr chunk (state=3): >>><<< 24160 1726853531.13689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.13693: _low_level_execute_command(): starting 24160 1726853531.13724: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933 `" && echo ansible-tmp-1726853531.1365643-24562-18651027223933="` echo /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933 `" ) && sleep 0' 24160 1726853531.14419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853531.14423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.14438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.14441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853531.14444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853531.14446: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853531.14448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.14528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853531.14531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853531.14533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24160 1726853531.14535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.14537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.14539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853531.14541: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853531.14543: stderr chunk (state=3): >>>debug2: match found <<< 24160 1726853531.14545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.14578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.14599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.14611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.14682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.16554: stdout chunk (state=3): >>>ansible-tmp-1726853531.1365643-24562-18651027223933=/root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933 <<< 24160 1726853531.16660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.16688: stderr chunk (state=3): >>><<< 24160 1726853531.16691: stdout chunk (state=3): >>><<< 24160 1726853531.16706: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853531.1365643-24562-18651027223933=/root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.16733: variable 'ansible_module_compression' from source: unknown 24160 1726853531.16774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853531.16804: variable 'ansible_facts' from source: unknown 24160 1726853531.16860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py 24160 1726853531.16959: Sending initial data 24160 1726853531.16962: Sent initial data (155 bytes) 24160 1726853531.17387: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.17391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.17394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.17396: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.17459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.17499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.17505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.19027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853531.19064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853531.19101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp1vtvwi7s /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py <<< 24160 1726853531.19105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py" <<< 24160 1726853531.19138: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp1vtvwi7s" to remote "/root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py" <<< 24160 1726853531.19141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py" <<< 24160 1726853531.19661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.19693: stderr chunk (state=3): >>><<< 24160 1726853531.19696: stdout chunk (state=3): >>><<< 24160 1726853531.19739: done transferring module to remote 24160 1726853531.19747: _low_level_execute_command(): starting 24160 1726853531.19751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/ /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py && sleep 0' 24160 1726853531.20198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853531.20203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 
1726853531.20205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.20207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.20255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.20261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.20304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.22029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.22042: stderr chunk (state=3): >>><<< 24160 1726853531.22045: stdout chunk (state=3): >>><<< 24160 1726853531.22064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.22068: _low_level_execute_command(): starting 24160 1726853531.22074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/AnsiballZ_command.py && sleep 0' 24160 1726853531.22518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.22522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853531.22524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853531.22527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.22529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.22579: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.22582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.22634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.39518: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 13:32:11.375697", "end": "2024-09-20 13:32:11.394237", "delta": "0:00:00.018540", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853531.41031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853531.41060: stderr chunk (state=3): >>><<< 24160 1726853531.41064: stdout chunk (state=3): >>><<< 24160 1726853531.41082: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 13:32:11.375697", "end": "2024-09-20 13:32:11.394237", "delta": "0:00:00.018540", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853531.41111: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853531.41119: _low_level_execute_command(): starting 24160 1726853531.41122: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853531.1365643-24562-18651027223933/ > /dev/null 2>&1 && sleep 0' 24160 1726853531.41594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.41597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853531.41599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.41601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.41603: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.41656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.41660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.41662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.41708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.43517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.43547: stderr chunk (state=3): >>><<< 24160 1726853531.43550: stdout chunk (state=3): >>><<< 24160 1726853531.43566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.43574: handler run complete 24160 1726853531.43590: Evaluated conditional (False): False 24160 1726853531.43599: attempt loop complete, returning result 24160 1726853531.43601: _execute() done 24160 1726853531.43604: dumping result to json 24160 1726853531.43610: done dumping result, returning 24160 1726853531.43617: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-5676-4eb4-0000000001b3] 24160 1726853531.43621: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b3 24160 1726853531.43712: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b3 24160 1726853531.43714: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018540", "end": "2024-09-20 13:32:11.394237", "rc": 0, "start": "2024-09-20 13:32:11.375697" } 24160 1726853531.43777: no more pending results, returning what we have 24160 1726853531.43780: results queue empty 24160 1726853531.43781: checking for any_errors_fatal 24160 1726853531.43795: done checking for any_errors_fatal 24160 1726853531.43795: checking for max_fail_percentage 24160 1726853531.43797: done checking for max_fail_percentage 24160 1726853531.43798: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.43799: done checking to see if all hosts have failed 24160 1726853531.43799: getting the remaining hosts for this loop 24160 1726853531.43800: done getting the remaining hosts for this loop 24160 1726853531.43804: getting the next task for host managed_node1 24160 1726853531.43810: done getting next task for host managed_node1 24160 1726853531.43812: ^ task is: TASK: Delete veth interface {{ interface }} 24160 1726853531.43815: ^ state is: HOST 
STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853531.43819: getting variables 24160 1726853531.43820: in VariableManager get_vars() 24160 1726853531.43859: Calling all_inventory to load vars for managed_node1 24160 1726853531.43862: Calling groups_inventory to load vars for managed_node1 24160 1726853531.43864: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.43876: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.43878: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.43881: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.44021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.44159: done with get_vars() 24160 1726853531.44167: done getting variables 24160 1726853531.44212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853531.44299: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 
September 2024 13:32:11 -0400 (0:00:00.359) 0:00:07.845 ****** 24160 1726853531.44322: entering _queue_task() for managed_node1/command 24160 1726853531.44525: worker is 1 (out of 1 available) 24160 1726853531.44540: exiting _queue_task() for managed_node1/command 24160 1726853531.44551: done queuing things up, now waiting for results queue to drain 24160 1726853531.44553: waiting for pending results... 24160 1726853531.44708: running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 24160 1726853531.44766: in run() - task 02083763-bbaf-5676-4eb4-0000000001b4 24160 1726853531.44785: variable 'ansible_search_path' from source: unknown 24160 1726853531.44788: variable 'ansible_search_path' from source: unknown 24160 1726853531.44813: calling self._execute() 24160 1726853531.44883: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.44893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.44897: variable 'omit' from source: magic vars 24160 1726853531.45153: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.45166: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.45294: variable 'type' from source: set_fact 24160 1726853531.45297: variable 'state' from source: include params 24160 1726853531.45300: variable 'interface' from source: set_fact 24160 1726853531.45305: variable 'current_interfaces' from source: set_fact 24160 1726853531.45312: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 24160 1726853531.45316: when evaluation is False, skipping this task 24160 1726853531.45319: _execute() done 24160 1726853531.45321: dumping result to json 24160 1726853531.45324: done dumping result, returning 24160 1726853531.45331: done running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 [02083763-bbaf-5676-4eb4-0000000001b4] 24160 
1726853531.45334: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b4 24160 1726853531.45411: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b4 24160 1726853531.45413: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24160 1726853531.45491: no more pending results, returning what we have 24160 1726853531.45494: results queue empty 24160 1726853531.45495: checking for any_errors_fatal 24160 1726853531.45501: done checking for any_errors_fatal 24160 1726853531.45502: checking for max_fail_percentage 24160 1726853531.45503: done checking for max_fail_percentage 24160 1726853531.45504: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.45504: done checking to see if all hosts have failed 24160 1726853531.45505: getting the remaining hosts for this loop 24160 1726853531.45506: done getting the remaining hosts for this loop 24160 1726853531.45509: getting the next task for host managed_node1 24160 1726853531.45514: done getting next task for host managed_node1 24160 1726853531.45516: ^ task is: TASK: Create dummy interface {{ interface }} 24160 1726853531.45519: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.45522: getting variables 24160 1726853531.45523: in VariableManager get_vars() 24160 1726853531.45554: Calling all_inventory to load vars for managed_node1 24160 1726853531.45557: Calling groups_inventory to load vars for managed_node1 24160 1726853531.45559: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.45567: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.45569: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.45574: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.45680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.45794: done with get_vars() 24160 1726853531.45802: done getting variables 24160 1726853531.45843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853531.45922: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:32:11 -0400 (0:00:00.016) 0:00:07.862 ****** 24160 1726853531.45944: entering _queue_task() for managed_node1/command 24160 1726853531.46133: worker is 1 (out of 1 available) 24160 1726853531.46147: exiting _queue_task() for managed_node1/command 24160 1726853531.46157: done queuing things up, now waiting for results queue to drain 24160 1726853531.46159: waiting for pending results... 
24160 1726853531.46309: running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 24160 1726853531.46367: in run() - task 02083763-bbaf-5676-4eb4-0000000001b5 24160 1726853531.46383: variable 'ansible_search_path' from source: unknown 24160 1726853531.46387: variable 'ansible_search_path' from source: unknown 24160 1726853531.46414: calling self._execute() 24160 1726853531.46475: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.46479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.46486: variable 'omit' from source: magic vars 24160 1726853531.46733: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.46743: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.46873: variable 'type' from source: set_fact 24160 1726853531.46877: variable 'state' from source: include params 24160 1726853531.46880: variable 'interface' from source: set_fact 24160 1726853531.46943: variable 'current_interfaces' from source: set_fact 24160 1726853531.46947: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 24160 1726853531.46949: when evaluation is False, skipping this task 24160 1726853531.46951: _execute() done 24160 1726853531.46953: dumping result to json 24160 1726853531.46955: done dumping result, returning 24160 1726853531.46957: done running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 [02083763-bbaf-5676-4eb4-0000000001b5] 24160 1726853531.46958: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b5 24160 1726853531.47019: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b5 24160 1726853531.47022: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 24160 1726853531.47079: no more pending results, returning what we have 24160 1726853531.47082: results queue empty 24160 1726853531.47083: checking for any_errors_fatal 24160 1726853531.47087: done checking for any_errors_fatal 24160 1726853531.47088: checking for max_fail_percentage 24160 1726853531.47089: done checking for max_fail_percentage 24160 1726853531.47090: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.47091: done checking to see if all hosts have failed 24160 1726853531.47091: getting the remaining hosts for this loop 24160 1726853531.47092: done getting the remaining hosts for this loop 24160 1726853531.47095: getting the next task for host managed_node1 24160 1726853531.47100: done getting next task for host managed_node1 24160 1726853531.47102: ^ task is: TASK: Delete dummy interface {{ interface }} 24160 1726853531.47105: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.47108: getting variables 24160 1726853531.47110: in VariableManager get_vars() 24160 1726853531.47131: Calling all_inventory to load vars for managed_node1 24160 1726853531.47133: Calling groups_inventory to load vars for managed_node1 24160 1726853531.47134: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.47141: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.47142: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.47144: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.47287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.47400: done with get_vars() 24160 1726853531.47406: done getting variables 24160 1726853531.47448: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853531.47521: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:32:11 -0400 (0:00:00.015) 0:00:07.878 ****** 24160 1726853531.47541: entering _queue_task() for managed_node1/command 24160 1726853531.47734: worker is 1 (out of 1 available) 24160 1726853531.47749: exiting _queue_task() for managed_node1/command 24160 1726853531.47759: done queuing things up, now waiting for results queue to drain 24160 1726853531.47761: waiting for pending results... 
24160 1726853531.47910: running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 24160 1726853531.47967: in run() - task 02083763-bbaf-5676-4eb4-0000000001b6 24160 1726853531.47980: variable 'ansible_search_path' from source: unknown 24160 1726853531.47985: variable 'ansible_search_path' from source: unknown 24160 1726853531.48012: calling self._execute() 24160 1726853531.48074: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.48077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.48084: variable 'omit' from source: magic vars 24160 1726853531.48333: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.48343: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.48468: variable 'type' from source: set_fact 24160 1726853531.48473: variable 'state' from source: include params 24160 1726853531.48476: variable 'interface' from source: set_fact 24160 1726853531.48481: variable 'current_interfaces' from source: set_fact 24160 1726853531.48488: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 24160 1726853531.48491: when evaluation is False, skipping this task 24160 1726853531.48493: _execute() done 24160 1726853531.48495: dumping result to json 24160 1726853531.48498: done dumping result, returning 24160 1726853531.48504: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 [02083763-bbaf-5676-4eb4-0000000001b6] 24160 1726853531.48509: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b6 24160 1726853531.48588: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b6 24160 1726853531.48591: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 24160 1726853531.48640: no more pending results, returning what we have 24160 1726853531.48643: results queue empty 24160 1726853531.48644: checking for any_errors_fatal 24160 1726853531.48649: done checking for any_errors_fatal 24160 1726853531.48650: checking for max_fail_percentage 24160 1726853531.48651: done checking for max_fail_percentage 24160 1726853531.48652: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.48653: done checking to see if all hosts have failed 24160 1726853531.48653: getting the remaining hosts for this loop 24160 1726853531.48655: done getting the remaining hosts for this loop 24160 1726853531.48658: getting the next task for host managed_node1 24160 1726853531.48663: done getting next task for host managed_node1 24160 1726853531.48666: ^ task is: TASK: Create tap interface {{ interface }} 24160 1726853531.48668: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.48673: getting variables 24160 1726853531.48675: in VariableManager get_vars() 24160 1726853531.48704: Calling all_inventory to load vars for managed_node1 24160 1726853531.48707: Calling groups_inventory to load vars for managed_node1 24160 1726853531.48709: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.48717: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.48719: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.48721: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.48831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.48946: done with get_vars() 24160 1726853531.48953: done getting variables 24160 1726853531.48996: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853531.49072: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:32:11 -0400 (0:00:00.015) 0:00:07.893 ****** 24160 1726853531.49093: entering _queue_task() for managed_node1/command 24160 1726853531.49283: worker is 1 (out of 1 available) 24160 1726853531.49299: exiting _queue_task() for managed_node1/command 24160 1726853531.49310: done queuing things up, now waiting for results queue to drain 24160 1726853531.49312: waiting for pending results... 
24160 1726853531.49456: running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 24160 1726853531.49520: in run() - task 02083763-bbaf-5676-4eb4-0000000001b7 24160 1726853531.49530: variable 'ansible_search_path' from source: unknown 24160 1726853531.49535: variable 'ansible_search_path' from source: unknown 24160 1726853531.49566: calling self._execute() 24160 1726853531.49626: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.49629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.49638: variable 'omit' from source: magic vars 24160 1726853531.49982: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.49986: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.50033: variable 'type' from source: set_fact 24160 1726853531.50036: variable 'state' from source: include params 24160 1726853531.50039: variable 'interface' from source: set_fact 24160 1726853531.50044: variable 'current_interfaces' from source: set_fact 24160 1726853531.50051: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 24160 1726853531.50054: when evaluation is False, skipping this task 24160 1726853531.50060: _execute() done 24160 1726853531.50063: dumping result to json 24160 1726853531.50066: done dumping result, returning 24160 1726853531.50072: done running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 [02083763-bbaf-5676-4eb4-0000000001b7] 24160 1726853531.50078: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b7 24160 1726853531.50156: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b7 24160 1726853531.50159: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 24160 1726853531.50244: no more pending results, returning what we have 24160 1726853531.50247: results queue empty 24160 1726853531.50247: checking for any_errors_fatal 24160 1726853531.50251: done checking for any_errors_fatal 24160 1726853531.50252: checking for max_fail_percentage 24160 1726853531.50254: done checking for max_fail_percentage 24160 1726853531.50254: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.50255: done checking to see if all hosts have failed 24160 1726853531.50256: getting the remaining hosts for this loop 24160 1726853531.50257: done getting the remaining hosts for this loop 24160 1726853531.50260: getting the next task for host managed_node1 24160 1726853531.50264: done getting next task for host managed_node1 24160 1726853531.50267: ^ task is: TASK: Delete tap interface {{ interface }} 24160 1726853531.50269: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.50275: getting variables 24160 1726853531.50276: in VariableManager get_vars() 24160 1726853531.50305: Calling all_inventory to load vars for managed_node1 24160 1726853531.50307: Calling groups_inventory to load vars for managed_node1 24160 1726853531.50309: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.50315: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.50317: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.50319: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.50459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.50599: done with get_vars() 24160 1726853531.50605: done getting variables 24160 1726853531.50648: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853531.50742: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:32:11 -0400 (0:00:00.016) 0:00:07.910 ****** 24160 1726853531.50764: entering _queue_task() for managed_node1/command 24160 1726853531.50947: worker is 1 (out of 1 available) 24160 1726853531.50962: exiting _queue_task() for managed_node1/command 24160 1726853531.50975: done queuing things up, now waiting for results queue to drain 24160 1726853531.50978: waiting for pending results... 
24160 1726853531.51120: running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 24160 1726853531.51176: in run() - task 02083763-bbaf-5676-4eb4-0000000001b8 24160 1726853531.51188: variable 'ansible_search_path' from source: unknown 24160 1726853531.51192: variable 'ansible_search_path' from source: unknown 24160 1726853531.51220: calling self._execute() 24160 1726853531.51283: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.51287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.51295: variable 'omit' from source: magic vars 24160 1726853531.51547: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.51559: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.51688: variable 'type' from source: set_fact 24160 1726853531.51692: variable 'state' from source: include params 24160 1726853531.51695: variable 'interface' from source: set_fact 24160 1726853531.51698: variable 'current_interfaces' from source: set_fact 24160 1726853531.51705: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 24160 1726853531.51708: when evaluation is False, skipping this task 24160 1726853531.51711: _execute() done 24160 1726853531.51713: dumping result to json 24160 1726853531.51716: done dumping result, returning 24160 1726853531.51722: done running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 [02083763-bbaf-5676-4eb4-0000000001b8] 24160 1726853531.51726: sending task result for task 02083763-bbaf-5676-4eb4-0000000001b8 24160 1726853531.51807: done sending task result for task 02083763-bbaf-5676-4eb4-0000000001b8 24160 1726853531.51810: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
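Both tap-interface tasks above are skipped because their `when` conditionals evaluate to False. A minimal Python sketch of that gating logic follows. This is a simplification: Ansible actually renders these expressions as Jinja2 against the host's variables, and the concrete values of `type` and `current_interfaces` below are assumptions for illustration (the log only tells us both conditions came out False).

```python
def should_create_tap(type_, state, interface, current_interfaces):
    # Mirrors: type == 'tap' and state == 'present' and interface not in current_interfaces
    return type_ == "tap" and state == "present" and interface not in current_interfaces

def should_delete_tap(type_, state, interface, current_interfaces):
    # Mirrors: type == 'tap' and state == 'absent' and interface in current_interfaces
    return type_ == "tap" and state == "absent" and interface in current_interfaces

# Hypothetical values consistent with this run: the interface type is not
# 'tap', so both conditionals are False and both tasks are skipped.
current = ["lo", "eth0", "ethtest0"]
create = should_create_tap("veth", "present", "ethtest0", current)
delete = should_delete_tap("veth", "present", "ethtest0", current)
```

With any `type` other than `'tap'`, both expressions short-circuit to False, which is exactly the `skip_reason: Conditional result was False` seen in the two results above.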
24160 1726853531.51894: no more pending results, returning what we have 24160 1726853531.51897: results queue empty 24160 1726853531.51898: checking for any_errors_fatal 24160 1726853531.51902: done checking for any_errors_fatal 24160 1726853531.51903: checking for max_fail_percentage 24160 1726853531.51905: done checking for max_fail_percentage 24160 1726853531.51905: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.51906: done checking to see if all hosts have failed 24160 1726853531.51907: getting the remaining hosts for this loop 24160 1726853531.51908: done getting the remaining hosts for this loop 24160 1726853531.51911: getting the next task for host managed_node1 24160 1726853531.51917: done getting next task for host managed_node1 24160 1726853531.51920: ^ task is: TASK: Include the task 'assert_device_present.yml' 24160 1726853531.51922: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.51925: getting variables 24160 1726853531.51926: in VariableManager get_vars() 24160 1726853531.51950: Calling all_inventory to load vars for managed_node1 24160 1726853531.51952: Calling groups_inventory to load vars for managed_node1 24160 1726853531.51954: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.51961: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.51962: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.51964: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.52076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.52191: done with get_vars() 24160 1726853531.52199: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:20 Friday 20 September 2024 13:32:11 -0400 (0:00:00.014) 0:00:07.925 ****** 24160 1726853531.52261: entering _queue_task() for managed_node1/include_tasks 24160 1726853531.52443: worker is 1 (out of 1 available) 24160 1726853531.52457: exiting _queue_task() for managed_node1/include_tasks 24160 1726853531.52473: done queuing things up, now waiting for results queue to drain 24160 1726853531.52475: waiting for pending results... 
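The recurring `HOST STATE:` dumps are the play iterator's per-host cursor into the play: which block and task the host is on, whether it is inside a rescue/always section, and a nested child state for tasks inside blocks. A rough dataclass model of the fields printed in the log (field names follow the log output; Ansible's real class, `HostState` in `ansible/executor/play_iterator.py`, has its own internals):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    # A sketch of the fields shown in the "HOST STATE:" log lines,
    # not Ansible's actual implementation.
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    handlers: int = 0
    run_state: int = 1
    fail_state: int = 0
    update_handlers: bool = True
    pending_setup: bool = False
    tasks_child_state: Optional["HostState"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# The state logged just above: block=2, task=6, no nested child state,
# i.e. the host has moved on to a top-level task (the include that follows).
s = HostState(block=2, task=6)
```

Note how the earlier dumps (block=2, task=5 with a `tasks child state`) carry a nested `HostState`: the host was executing inside an included block, and the child cursor tracks its position within that block independently of the outer play.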
24160 1726853531.52615: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 24160 1726853531.52778: in run() - task 02083763-bbaf-5676-4eb4-00000000000e 24160 1726853531.52781: variable 'ansible_search_path' from source: unknown 24160 1726853531.52785: calling self._execute() 24160 1726853531.52817: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.52828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.52841: variable 'omit' from source: magic vars 24160 1726853531.53187: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.53204: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.53217: _execute() done 24160 1726853531.53224: dumping result to json 24160 1726853531.53232: done dumping result, returning 24160 1726853531.53242: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-5676-4eb4-00000000000e] 24160 1726853531.53251: sending task result for task 02083763-bbaf-5676-4eb4-00000000000e 24160 1726853531.53382: done sending task result for task 02083763-bbaf-5676-4eb4-00000000000e 24160 1726853531.53385: WORKER PROCESS EXITING 24160 1726853531.53448: no more pending results, returning what we have 24160 1726853531.53456: in VariableManager get_vars() 24160 1726853531.53499: Calling all_inventory to load vars for managed_node1 24160 1726853531.53503: Calling groups_inventory to load vars for managed_node1 24160 1726853531.53506: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.53518: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.53522: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.53525: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.54037: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.54142: done with get_vars() 24160 1726853531.54147: variable 'ansible_search_path' from source: unknown 24160 1726853531.54155: we have included files to process 24160 1726853531.54156: generating all_blocks data 24160 1726853531.54157: done generating all_blocks data 24160 1726853531.54159: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24160 1726853531.54159: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24160 1726853531.54161: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24160 1726853531.54259: in VariableManager get_vars() 24160 1726853531.54273: done with get_vars() 24160 1726853531.54343: done processing included file 24160 1726853531.54345: iterating over new_blocks loaded from include file 24160 1726853531.54346: in VariableManager get_vars() 24160 1726853531.54354: done with get_vars() 24160 1726853531.54356: filtering new block on tags 24160 1726853531.54366: done filtering new block on tags 24160 1726853531.54367: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 24160 1726853531.54374: extending task lists for all hosts with included blocks 24160 1726853531.55228: done extending task lists 24160 1726853531.55229: done processing included files 24160 1726853531.55229: results queue empty 24160 1726853531.55230: checking for any_errors_fatal 24160 1726853531.55231: done checking for any_errors_fatal 24160 1726853531.55232: checking for max_fail_percentage 24160 1726853531.55232: done 
checking for max_fail_percentage 24160 1726853531.55233: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.55234: done checking to see if all hosts have failed 24160 1726853531.55234: getting the remaining hosts for this loop 24160 1726853531.55235: done getting the remaining hosts for this loop 24160 1726853531.55236: getting the next task for host managed_node1 24160 1726853531.55239: done getting next task for host managed_node1 24160 1726853531.55240: ^ task is: TASK: Include the task 'get_interface_stat.yml' 24160 1726853531.55242: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.55243: getting variables 24160 1726853531.55244: in VariableManager get_vars() 24160 1726853531.55251: Calling all_inventory to load vars for managed_node1 24160 1726853531.55252: Calling groups_inventory to load vars for managed_node1 24160 1726853531.55255: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.55259: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.55261: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.55262: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.55341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.55447: done with get_vars() 24160 1726853531.55455: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:32:11 -0400 (0:00:00.032) 0:00:07.957 ****** 24160 1726853531.55507: entering _queue_task() for managed_node1/include_tasks 24160 1726853531.55703: worker is 1 (out of 1 available) 24160 1726853531.55718: exiting _queue_task() for managed_node1/include_tasks 24160 1726853531.55731: done queuing things up, now waiting for results queue to drain 24160 1726853531.55732: waiting for pending results... 
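The include processing above follows a fixed sequence: load the included file, generate its blocks, filter each new block on tags, then extend every affected host's task list with the surviving blocks. A toy model of that final splice step (simplified; in Ansible the real logic is spread across the strategy plugins and the play iterator):

```python
def extend_task_lists(host_tasks, included_blocks, only_tags=None):
    """Append included task blocks to a host's pending list, dropping blocks
    whose tags don't intersect the requested tags -- a stand-in for the
    'filtering new block on tags' step seen in the log."""
    kept = [
        b for b in included_blocks
        if not only_tags or set(b.get("tags", [])) & set(only_tags)
    ]
    return host_tasks + kept

# Hypothetical task dicts standing in for real Block objects:
pending = [{"name": "Include the task 'get_interface_stat.yml'"}]
new = [{"name": "Get stat for interface ethtest0", "tags": ["always"]}]
result = extend_task_lists(pending, new)
```

Because no `--tags` restriction is in effect in this run, every generated block passes the filter, matching the log's `done filtering new block on tags` followed by `extending task lists for all hosts with included blocks`.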
24160 1726853531.56089: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 24160 1726853531.56093: in run() - task 02083763-bbaf-5676-4eb4-0000000002bc 24160 1726853531.56096: variable 'ansible_search_path' from source: unknown 24160 1726853531.56099: variable 'ansible_search_path' from source: unknown 24160 1726853531.56101: calling self._execute() 24160 1726853531.56154: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.56166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.56183: variable 'omit' from source: magic vars 24160 1726853531.56538: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.56556: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.56567: _execute() done 24160 1726853531.56579: dumping result to json 24160 1726853531.56587: done dumping result, returning 24160 1726853531.56599: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-5676-4eb4-0000000002bc] 24160 1726853531.56608: sending task result for task 02083763-bbaf-5676-4eb4-0000000002bc 24160 1726853531.56734: no more pending results, returning what we have 24160 1726853531.56739: in VariableManager get_vars() 24160 1726853531.56781: Calling all_inventory to load vars for managed_node1 24160 1726853531.56784: Calling groups_inventory to load vars for managed_node1 24160 1726853531.56787: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.56801: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.56804: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.56807: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.57105: done sending task result for task 02083763-bbaf-5676-4eb4-0000000002bc 24160 1726853531.57109: WORKER PROCESS EXITING 24160 
1726853531.57123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.57315: done with get_vars() 24160 1726853531.57323: variable 'ansible_search_path' from source: unknown 24160 1726853531.57324: variable 'ansible_search_path' from source: unknown 24160 1726853531.57357: we have included files to process 24160 1726853531.57358: generating all_blocks data 24160 1726853531.57359: done generating all_blocks data 24160 1726853531.57361: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24160 1726853531.57362: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24160 1726853531.57364: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24160 1726853531.57570: done processing included file 24160 1726853531.57574: iterating over new_blocks loaded from include file 24160 1726853531.57575: in VariableManager get_vars() 24160 1726853531.57590: done with get_vars() 24160 1726853531.57592: filtering new block on tags 24160 1726853531.57606: done filtering new block on tags 24160 1726853531.57608: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 24160 1726853531.57612: extending task lists for all hosts with included blocks 24160 1726853531.57717: done extending task lists 24160 1726853531.57719: done processing included files 24160 1726853531.57719: results queue empty 24160 1726853531.57720: checking for any_errors_fatal 24160 1726853531.57723: done checking for any_errors_fatal 24160 1726853531.57724: checking for max_fail_percentage 24160 1726853531.57725: done checking for 
max_fail_percentage 24160 1726853531.57726: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.57727: done checking to see if all hosts have failed 24160 1726853531.57727: getting the remaining hosts for this loop 24160 1726853531.57728: done getting the remaining hosts for this loop 24160 1726853531.57731: getting the next task for host managed_node1 24160 1726853531.57735: done getting next task for host managed_node1 24160 1726853531.57737: ^ task is: TASK: Get stat for interface {{ interface }} 24160 1726853531.57740: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.57742: getting variables 24160 1726853531.57743: in VariableManager get_vars() 24160 1726853531.57754: Calling all_inventory to load vars for managed_node1 24160 1726853531.57756: Calling groups_inventory to load vars for managed_node1 24160 1726853531.57758: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.57763: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.57765: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.57768: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.57906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.58098: done with get_vars() 24160 1726853531.58106: done getting variables 24160 1726853531.58252: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:32:11 -0400 (0:00:00.027) 0:00:07.985 ****** 24160 1726853531.58284: entering _queue_task() for managed_node1/stat 24160 1726853531.58526: worker is 1 (out of 1 available) 24160 1726853531.58538: exiting _queue_task() for managed_node1/stat 24160 1726853531.58549: done queuing things up, now waiting for results queue to drain 24160 1726853531.58550: waiting for pending results... 
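The `get_interface_stat.yml` task queued here runs the `stat` module against the interface on the managed node; these role tests typically check device presence via the interface's sysfs entry. A hedged local equivalent (the `/sys/class/net` path is an assumption about what the included task stats, based on how the role's tests usually verify devices):

```python
import os
import tempfile

def interface_present(name, sysfs_root="/sys/class/net"):
    """Return True if a network device directory with this name exists,
    mirroring what a stat of /sys/class/net/<name> would report."""
    return os.path.exists(os.path.join(sysfs_root, name))

# Demonstrate with a fake sysfs tree so this runs on any platform:
fake_sysfs = tempfile.mkdtemp()
os.mkdir(os.path.join(fake_sysfs, "ethtest0"))
present = interface_present("ethtest0", sysfs_root=fake_sysfs)
```

On the real managed node, the assertion in `assert_device_present.yml` would then fail the play if the stat reports the device missing, rather than merely returning False.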
24160 1726853531.58988: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 24160 1726853531.58993: in run() - task 02083763-bbaf-5676-4eb4-000000000373 24160 1726853531.58996: variable 'ansible_search_path' from source: unknown 24160 1726853531.58998: variable 'ansible_search_path' from source: unknown 24160 1726853531.59001: calling self._execute() 24160 1726853531.59048: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.59059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.59073: variable 'omit' from source: magic vars 24160 1726853531.59493: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.59511: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.59522: variable 'omit' from source: magic vars 24160 1726853531.59573: variable 'omit' from source: magic vars 24160 1726853531.59674: variable 'interface' from source: set_fact 24160 1726853531.59696: variable 'omit' from source: magic vars 24160 1726853531.59738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853531.59785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853531.59809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853531.59831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.59856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.59898: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853531.59906: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.59914: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.60020: Set connection var ansible_shell_executable to /bin/sh 24160 1726853531.60032: Set connection var ansible_pipelining to False 24160 1726853531.60039: Set connection var ansible_connection to ssh 24160 1726853531.60046: Set connection var ansible_shell_type to sh 24160 1726853531.60061: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853531.60092: Set connection var ansible_timeout to 10 24160 1726853531.60106: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.60114: variable 'ansible_connection' from source: unknown 24160 1726853531.60202: variable 'ansible_module_compression' from source: unknown 24160 1726853531.60205: variable 'ansible_shell_type' from source: unknown 24160 1726853531.60207: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.60209: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.60211: variable 'ansible_pipelining' from source: unknown 24160 1726853531.60213: variable 'ansible_timeout' from source: unknown 24160 1726853531.60215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.60352: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853531.60366: variable 'omit' from source: magic vars 24160 1726853531.60377: starting attempt loop 24160 1726853531.60382: running the handler 24160 1726853531.60398: _low_level_execute_command(): starting 24160 1726853531.60407: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853531.61201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.61312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.61328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.61368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.63058: stdout chunk (state=3): >>>/root <<< 24160 1726853531.63277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.63281: stdout chunk (state=3): >>><<< 24160 1726853531.63284: stderr chunk (state=3): >>><<< 24160 1726853531.63286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.63289: _low_level_execute_command(): starting 24160 1726853531.63292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575 `" && echo ansible-tmp-1726853531.632269-24597-230597124621575="` echo /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575 `" ) && sleep 0' 24160 1726853531.63910: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853531.63924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.64036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.64061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.64091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.64110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.64185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.66319: stdout chunk (state=3): >>>ansible-tmp-1726853531.632269-24597-230597124621575=/root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575 <<< 24160 1726853531.66384: stdout chunk (state=3): >>><<< 24160 1726853531.66393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.66401: stderr chunk (state=3): >>><<< 24160 1726853531.66431: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853531.632269-24597-230597124621575=/root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.66488: variable 'ansible_module_compression' from source: unknown 24160 1726853531.66633: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24160 1726853531.66636: variable 'ansible_facts' from source: unknown 24160 1726853531.66715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py 24160 1726853531.66868: Sending initial data 24160 1726853531.66969: Sent initial data (152 bytes) 24160 1726853531.67559: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853531.67624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.67684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.67714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.67786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.69350: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853531.69386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853531.69453: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp0omwg1dp /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py <<< 24160 1726853531.69457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py" <<< 24160 1726853531.69495: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp0omwg1dp" to remote "/root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py" <<< 24160 1726853531.70411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.70415: stdout chunk (state=3): >>><<< 24160 1726853531.70417: stderr chunk (state=3): >>><<< 24160 1726853531.70419: done transferring module to remote 24160 1726853531.70422: _low_level_execute_command(): starting 24160 1726853531.70424: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/ /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py && sleep 0' 24160 1726853531.71091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.71120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.71134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.71161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.71222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.73019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.73036: stdout chunk (state=3): >>><<< 24160 1726853531.73047: stderr chunk (state=3): >>><<< 24160 1726853531.73070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.73089: _low_level_execute_command(): starting 24160 1726853531.73102: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/AnsiballZ_stat.py && sleep 0' 24160 1726853531.73718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853531.73734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.73750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.73768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853531.73841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.73888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.73908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853531.73944: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.74020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.89141: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28928, "dev": 23, "nlink": 1, "atime": 1726853530.2984927, "mtime": 1726853530.2984927, "ctime": 1726853530.2984927, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24160 1726853531.90408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853531.90432: stderr chunk (state=3): >>><<< 24160 1726853531.90438: stdout chunk (state=3): >>><<< 24160 1726853531.90458: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28928, "dev": 23, "nlink": 1, "atime": 1726853530.2984927, "mtime": 1726853530.2984927, "ctime": 1726853530.2984927, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853531.90495: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853531.90503: _low_level_execute_command(): starting 24160 1726853531.90508: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853531.632269-24597-230597124621575/ > /dev/null 2>&1 && sleep 0' 24160 1726853531.90930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853531.90933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853531.90936: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853531.90938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853531.90940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853531.90992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853531.90999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853531.91038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853531.92864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853531.92893: stderr chunk (state=3): >>><<< 24160 1726853531.92897: stdout chunk (state=3): >>><<< 24160 1726853531.92910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853531.92916: handler run complete 24160 1726853531.92949: attempt loop complete, returning result 24160 1726853531.92952: _execute() done 24160 1726853531.92958: dumping result to json 24160 1726853531.92960: done dumping result, returning 24160 1726853531.92968: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [02083763-bbaf-5676-4eb4-000000000373] 24160 1726853531.92970: sending task result for task 02083763-bbaf-5676-4eb4-000000000373 24160 1726853531.93078: done sending task result for task 02083763-bbaf-5676-4eb4-000000000373 24160 1726853531.93081: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853530.2984927, "block_size": 4096, "blocks": 0, "ctime": 1726853530.2984927, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28928, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726853530.2984927, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 24160 1726853531.93164: no more pending results, 
returning what we have 24160 1726853531.93167: results queue empty 24160 1726853531.93168: checking for any_errors_fatal 24160 1726853531.93169: done checking for any_errors_fatal 24160 1726853531.93170: checking for max_fail_percentage 24160 1726853531.93173: done checking for max_fail_percentage 24160 1726853531.93174: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.93175: done checking to see if all hosts have failed 24160 1726853531.93175: getting the remaining hosts for this loop 24160 1726853531.93177: done getting the remaining hosts for this loop 24160 1726853531.93180: getting the next task for host managed_node1 24160 1726853531.93187: done getting next task for host managed_node1 24160 1726853531.93190: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 24160 1726853531.93193: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.93197: getting variables 24160 1726853531.93198: in VariableManager get_vars() 24160 1726853531.93298: Calling all_inventory to load vars for managed_node1 24160 1726853531.93301: Calling groups_inventory to load vars for managed_node1 24160 1726853531.93303: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.93311: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.93314: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.93316: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.93423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.93543: done with get_vars() 24160 1726853531.93551: done getting variables 24160 1726853531.93626: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 24160 1726853531.93715: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:32:11 -0400 (0:00:00.354) 0:00:08.340 ****** 24160 1726853531.93737: entering _queue_task() for managed_node1/assert 24160 1726853531.93739: Creating lock for assert 24160 1726853531.93967: worker is 1 (out of 1 available) 24160 1726853531.93981: exiting _queue_task() for managed_node1/assert 24160 1726853531.93993: done queuing things up, now waiting for results queue to drain 24160 1726853531.93995: waiting for pending results... 
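Aside on what the `Get stat for interface ethtest0` task above actually checks: the kernel exposes each live network interface as a symlink `/sys/class/net/<iface>` pointing into `/sys/devices/` (the result shows `lnk_target: ../../devices/virtual/net/ethtest0`), so presence can be detected without following the link. A minimal standalone sketch of the same check — the `interface_present` helper name is illustrative, not part of the test suite:

```python
import os

def interface_present(name, sysfs="/sys/class/net"):
    """Return True if a kernel network interface is visible under sysfs.

    /sys/class/net/<iface> is a symlink to the device's entry under
    /sys/devices/, so lexists() (which does not follow the link) is
    enough to detect presence, mirroring what the stat module reports
    via "exists": true / "islnk": true in the log above.
    """
    return os.path.lexists(os.path.join(sysfs, name))
```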
24160 1726853531.94165: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' 24160 1726853531.94227: in run() - task 02083763-bbaf-5676-4eb4-0000000002bd 24160 1726853531.94239: variable 'ansible_search_path' from source: unknown 24160 1726853531.94243: variable 'ansible_search_path' from source: unknown 24160 1726853531.94273: calling self._execute() 24160 1726853531.94344: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.94350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.94359: variable 'omit' from source: magic vars 24160 1726853531.94617: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.94628: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.94638: variable 'omit' from source: magic vars 24160 1726853531.94661: variable 'omit' from source: magic vars 24160 1726853531.94728: variable 'interface' from source: set_fact 24160 1726853531.94747: variable 'omit' from source: magic vars 24160 1726853531.94778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853531.94805: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853531.94820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853531.94833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.94842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.94870: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853531.94876: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.94878: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.94945: Set connection var ansible_shell_executable to /bin/sh 24160 1726853531.94948: Set connection var ansible_pipelining to False 24160 1726853531.94950: Set connection var ansible_connection to ssh 24160 1726853531.94957: Set connection var ansible_shell_type to sh 24160 1726853531.94961: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853531.94974: Set connection var ansible_timeout to 10 24160 1726853531.94993: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.94996: variable 'ansible_connection' from source: unknown 24160 1726853531.94999: variable 'ansible_module_compression' from source: unknown 24160 1726853531.95001: variable 'ansible_shell_type' from source: unknown 24160 1726853531.95003: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.95006: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.95008: variable 'ansible_pipelining' from source: unknown 24160 1726853531.95010: variable 'ansible_timeout' from source: unknown 24160 1726853531.95015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.95121: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853531.95129: variable 'omit' from source: magic vars 24160 1726853531.95135: starting attempt loop 24160 1726853531.95137: running the handler 24160 1726853531.95233: variable 'interface_stat' from source: set_fact 24160 1726853531.95247: Evaluated conditional (interface_stat.stat.exists): True 24160 1726853531.95252: handler run complete 24160 1726853531.95265: attempt loop complete, returning result 24160 
1726853531.95268: _execute() done 24160 1726853531.95273: dumping result to json 24160 1726853531.95275: done dumping result, returning 24160 1726853531.95282: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' [02083763-bbaf-5676-4eb4-0000000002bd] 24160 1726853531.95286: sending task result for task 02083763-bbaf-5676-4eb4-0000000002bd 24160 1726853531.95364: done sending task result for task 02083763-bbaf-5676-4eb4-0000000002bd 24160 1726853531.95367: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24160 1726853531.95453: no more pending results, returning what we have 24160 1726853531.95456: results queue empty 24160 1726853531.95457: checking for any_errors_fatal 24160 1726853531.95465: done checking for any_errors_fatal 24160 1726853531.95466: checking for max_fail_percentage 24160 1726853531.95467: done checking for max_fail_percentage 24160 1726853531.95468: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.95468: done checking to see if all hosts have failed 24160 1726853531.95469: getting the remaining hosts for this loop 24160 1726853531.95472: done getting the remaining hosts for this loop 24160 1726853531.95476: getting the next task for host managed_node1 24160 1726853531.95482: done getting next task for host managed_node1 24160 1726853531.95484: ^ task is: TASK: Initialize the connection_failed flag 24160 1726853531.95485: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.95489: getting variables 24160 1726853531.95490: in VariableManager get_vars() 24160 1726853531.95522: Calling all_inventory to load vars for managed_node1 24160 1726853531.95525: Calling groups_inventory to load vars for managed_node1 24160 1726853531.95527: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.95535: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.95537: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.95540: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.95665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.95814: done with get_vars() 24160 1726853531.95821: done getting variables 24160 1726853531.95861: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize the connection_failed flag] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:23 Friday 20 September 2024 13:32:11 -0400 (0:00:00.021) 0:00:08.361 ****** 24160 1726853531.95883: entering _queue_task() for managed_node1/set_fact 24160 1726853531.96078: worker is 1 (out of 1 available) 24160 1726853531.96090: exiting _queue_task() for managed_node1/set_fact 24160 1726853531.96101: done queuing things up, now waiting for results queue to drain 24160 1726853531.96102: waiting for pending results... 
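The two tasks traced above (stat the sysfs path, then assert on `interface_stat.stat.exists`) correspond to a task pair along these lines. This is a sketch reconstructed from the logged module arguments (`get_attributes: false`, `get_checksum: false`, `get_mime: false`, `path: /sys/class/net/ethtest0`), not the verbatim contents of `assert_device_present.yml`:

```yaml
- name: Get stat for interface ethtest0
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```

Registering the `stat` result and asserting on `.stat.exists` is the standard pattern for device-presence checks; the subsequent `set_fact` task simply initializes `connection_failed: false` for later error handling.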
24160 1726853531.96252: running TaskExecutor() for managed_node1/TASK: Initialize the connection_failed flag 24160 1726853531.96305: in run() - task 02083763-bbaf-5676-4eb4-00000000000f 24160 1726853531.96317: variable 'ansible_search_path' from source: unknown 24160 1726853531.96346: calling self._execute() 24160 1726853531.96413: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.96417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.96425: variable 'omit' from source: magic vars 24160 1726853531.96681: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.96690: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.96696: variable 'omit' from source: magic vars 24160 1726853531.96709: variable 'omit' from source: magic vars 24160 1726853531.96733: variable 'omit' from source: magic vars 24160 1726853531.96765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853531.96795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853531.96810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853531.96823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.96832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853531.96854: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853531.96860: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.96872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.96934: Set connection var ansible_shell_executable to /bin/sh 24160 
1726853531.96937: Set connection var ansible_pipelining to False 24160 1726853531.96940: Set connection var ansible_connection to ssh 24160 1726853531.96942: Set connection var ansible_shell_type to sh 24160 1726853531.96949: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853531.96959: Set connection var ansible_timeout to 10 24160 1726853531.96976: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.96979: variable 'ansible_connection' from source: unknown 24160 1726853531.96983: variable 'ansible_module_compression' from source: unknown 24160 1726853531.96985: variable 'ansible_shell_type' from source: unknown 24160 1726853531.96989: variable 'ansible_shell_executable' from source: unknown 24160 1726853531.96991: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.96993: variable 'ansible_pipelining' from source: unknown 24160 1726853531.96995: variable 'ansible_timeout' from source: unknown 24160 1726853531.96997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.97100: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853531.97109: variable 'omit' from source: magic vars 24160 1726853531.97113: starting attempt loop 24160 1726853531.97115: running the handler 24160 1726853531.97127: handler run complete 24160 1726853531.97135: attempt loop complete, returning result 24160 1726853531.97138: _execute() done 24160 1726853531.97140: dumping result to json 24160 1726853531.97142: done dumping result, returning 24160 1726853531.97149: done running TaskExecutor() for managed_node1/TASK: Initialize the connection_failed flag [02083763-bbaf-5676-4eb4-00000000000f] 24160 
1726853531.97151: sending task result for task 02083763-bbaf-5676-4eb4-00000000000f 24160 1726853531.97231: done sending task result for task 02083763-bbaf-5676-4eb4-00000000000f 24160 1726853531.97234: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "connection_failed": false }, "changed": false } 24160 1726853531.97286: no more pending results, returning what we have 24160 1726853531.97289: results queue empty 24160 1726853531.97290: checking for any_errors_fatal 24160 1726853531.97297: done checking for any_errors_fatal 24160 1726853531.97298: checking for max_fail_percentage 24160 1726853531.97299: done checking for max_fail_percentage 24160 1726853531.97300: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.97301: done checking to see if all hosts have failed 24160 1726853531.97301: getting the remaining hosts for this loop 24160 1726853531.97303: done getting the remaining hosts for this loop 24160 1726853531.97306: getting the next task for host managed_node1 24160 1726853531.97311: done getting next task for host managed_node1 24160 1726853531.97316: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24160 1726853531.97318: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.97331: getting variables 24160 1726853531.97332: in VariableManager get_vars() 24160 1726853531.97361: Calling all_inventory to load vars for managed_node1 24160 1726853531.97363: Calling groups_inventory to load vars for managed_node1 24160 1726853531.97365: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.97374: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.97376: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.97379: Calling groups_plugins_play to load vars for managed_node1 24160 1726853531.97488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.97607: done with get_vars() 24160 1726853531.97615: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:32:11 -0400 (0:00:00.017) 0:00:08.379 ****** 24160 1726853531.97678: entering _queue_task() for managed_node1/include_tasks 24160 1726853531.97853: worker is 1 (out of 1 available) 24160 1726853531.97864: exiting _queue_task() for managed_node1/include_tasks 24160 1726853531.97877: done queuing things up, now waiting for results queue to drain 24160 1726853531.97879: waiting for pending results... 
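An aside on the repeated `Evaluated conditional (ansible_distribution_major_version != '6'): True` lines above: Ansible stores `ansible_distribution_major_version` as a string fact and evaluates the `when:` expression through Jinja2, so the check is a plain string inequality. A minimal sketch, with a hypothetical fact value (the log does not show the actual version):

```python
# Hypothetical fact value; Ansible keeps distribution_major_version as a string.
facts = {"ansible_distribution_major_version": "40"}  # assumed, e.g. Fedora 40

def evaluate_not_six(facts: dict) -> bool:
    """Mimic the Jinja2 conditional: ansible_distribution_major_version != '6'.

    Because the fact is a string, this is string comparison, not numeric.
    """
    return facts["ansible_distribution_major_version"] != "6"

print(evaluate_not_six(facts))  # True, matching "Evaluated conditional ...: True"
```

Note the string semantics: a host reporting version `"6"` (and only exactly `"6"`) would make the conditional False and skip the task.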
24160 1726853531.98028: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24160 1726853531.98100: in run() - task 02083763-bbaf-5676-4eb4-000000000017 24160 1726853531.98112: variable 'ansible_search_path' from source: unknown 24160 1726853531.98116: variable 'ansible_search_path' from source: unknown 24160 1726853531.98140: calling self._execute() 24160 1726853531.98199: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853531.98203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853531.98212: variable 'omit' from source: magic vars 24160 1726853531.98456: variable 'ansible_distribution_major_version' from source: facts 24160 1726853531.98468: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853531.98476: _execute() done 24160 1726853531.98478: dumping result to json 24160 1726853531.98482: done dumping result, returning 24160 1726853531.98489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5676-4eb4-000000000017] 24160 1726853531.98494: sending task result for task 02083763-bbaf-5676-4eb4-000000000017 24160 1726853531.98573: done sending task result for task 02083763-bbaf-5676-4eb4-000000000017 24160 1726853531.98576: WORKER PROCESS EXITING 24160 1726853531.98612: no more pending results, returning what we have 24160 1726853531.98616: in VariableManager get_vars() 24160 1726853531.98651: Calling all_inventory to load vars for managed_node1 24160 1726853531.98654: Calling groups_inventory to load vars for managed_node1 24160 1726853531.98656: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.98663: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.98665: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.98667: Calling 
groups_plugins_play to load vars for managed_node1 24160 1726853531.98825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853531.98940: done with get_vars() 24160 1726853531.98945: variable 'ansible_search_path' from source: unknown 24160 1726853531.98946: variable 'ansible_search_path' from source: unknown 24160 1726853531.98974: we have included files to process 24160 1726853531.98975: generating all_blocks data 24160 1726853531.98976: done generating all_blocks data 24160 1726853531.98979: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24160 1726853531.98979: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24160 1726853531.98981: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24160 1726853531.99415: done processing included file 24160 1726853531.99416: iterating over new_blocks loaded from include file 24160 1726853531.99417: in VariableManager get_vars() 24160 1726853531.99431: done with get_vars() 24160 1726853531.99432: filtering new block on tags 24160 1726853531.99442: done filtering new block on tags 24160 1726853531.99444: in VariableManager get_vars() 24160 1726853531.99456: done with get_vars() 24160 1726853531.99457: filtering new block on tags 24160 1726853531.99468: done filtering new block on tags 24160 1726853531.99469: in VariableManager get_vars() 24160 1726853531.99483: done with get_vars() 24160 1726853531.99484: filtering new block on tags 24160 1726853531.99497: done filtering new block on tags 24160 1726853531.99498: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 24160 1726853531.99501: extending task lists for 
all hosts with included blocks 24160 1726853531.99939: done extending task lists 24160 1726853531.99940: done processing included files 24160 1726853531.99941: results queue empty 24160 1726853531.99941: checking for any_errors_fatal 24160 1726853531.99943: done checking for any_errors_fatal 24160 1726853531.99944: checking for max_fail_percentage 24160 1726853531.99944: done checking for max_fail_percentage 24160 1726853531.99945: checking to see if all hosts have failed and the running result is not ok 24160 1726853531.99946: done checking to see if all hosts have failed 24160 1726853531.99946: getting the remaining hosts for this loop 24160 1726853531.99947: done getting the remaining hosts for this loop 24160 1726853531.99948: getting the next task for host managed_node1 24160 1726853531.99951: done getting next task for host managed_node1 24160 1726853531.99953: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24160 1726853531.99955: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853531.99962: getting variables 24160 1726853531.99962: in VariableManager get_vars() 24160 1726853531.99973: Calling all_inventory to load vars for managed_node1 24160 1726853531.99975: Calling groups_inventory to load vars for managed_node1 24160 1726853531.99976: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853531.99979: Calling all_plugins_play to load vars for managed_node1 24160 1726853531.99981: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853531.99982: Calling groups_plugins_play to load vars for managed_node1 24160 1726853532.00080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853532.00194: done with get_vars() 24160 1726853532.00200: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:32:12 -0400 (0:00:00.025) 0:00:08.405 ****** 24160 1726853532.00247: entering _queue_task() for managed_node1/setup 24160 1726853532.00447: worker is 1 (out of 1 available) 24160 1726853532.00458: exiting _queue_task() for managed_node1/setup 24160 1726853532.00469: done queuing things up, now waiting for results queue to drain 24160 1726853532.00472: waiting for pending results... 
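The `filtering new block on tags` messages above refer to Ansible pruning the blocks loaded from `set_facts.yml` against any requested tags. A much-simplified sketch of that idea, with hypothetical task and tag names (the real implementation also handles `always`/`never` tags and inheritance, which are omitted here):

```python
# Simplified tag filtering: keep tasks whose tags intersect the requested set.
# Task names/tags below are illustrative, not taken from the actual role.
def filter_on_tags(tasks, only_tags):
    if not only_tags:  # no --tags given: keep everything
        return list(tasks)
    return [t for t in tasks if set(t.get("tags", [])) & only_tags]

tasks = [
    {"name": "Ensure ansible_facts used by role are present", "tags": ["setup"]},
    {"name": "Check if system is ostree", "tags": ["ostree"]},
]
kept = filter_on_tags(tasks, {"ostree"})
print([t["name"] for t in kept])  # ['Check if system is ostree']
```

With no tags requested (as in this run), every included block survives filtering, which is why all three blocks proceed to `done filtering new block on tags`.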
24160 1726853532.00624: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24160 1726853532.00710: in run() - task 02083763-bbaf-5676-4eb4-00000000038e 24160 1726853532.00721: variable 'ansible_search_path' from source: unknown 24160 1726853532.00724: variable 'ansible_search_path' from source: unknown 24160 1726853532.00749: calling self._execute() 24160 1726853532.00812: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853532.00816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853532.00823: variable 'omit' from source: magic vars 24160 1726853532.01076: variable 'ansible_distribution_major_version' from source: facts 24160 1726853532.01086: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853532.01225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853532.02629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853532.02679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853532.02705: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853532.02730: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853532.02749: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853532.02811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853532.02831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853532.02847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853532.02881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853532.02892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853532.02929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853532.02945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853532.02965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853532.02996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853532.03006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853532.03112: variable '__network_required_facts' from source: role 
'' defaults 24160 1726853532.03119: variable 'ansible_facts' from source: unknown 24160 1726853532.03180: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24160 1726853532.03183: when evaluation is False, skipping this task 24160 1726853532.03186: _execute() done 24160 1726853532.03190: dumping result to json 24160 1726853532.03193: done dumping result, returning 24160 1726853532.03196: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5676-4eb4-00000000038e] 24160 1726853532.03205: sending task result for task 02083763-bbaf-5676-4eb4-00000000038e 24160 1726853532.03281: done sending task result for task 02083763-bbaf-5676-4eb4-00000000038e 24160 1726853532.03283: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853532.03350: no more pending results, returning what we have 24160 1726853532.03353: results queue empty 24160 1726853532.03354: checking for any_errors_fatal 24160 1726853532.03355: done checking for any_errors_fatal 24160 1726853532.03355: checking for max_fail_percentage 24160 1726853532.03357: done checking for max_fail_percentage 24160 1726853532.03358: checking to see if all hosts have failed and the running result is not ok 24160 1726853532.03358: done checking to see if all hosts have failed 24160 1726853532.03359: getting the remaining hosts for this loop 24160 1726853532.03360: done getting the remaining hosts for this loop 24160 1726853532.03363: getting the next task for host managed_node1 24160 1726853532.03374: done getting next task for host managed_node1 24160 1726853532.03377: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24160 1726853532.03381: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853532.03394: getting variables 24160 1726853532.03395: in VariableManager get_vars() 24160 1726853532.03427: Calling all_inventory to load vars for managed_node1 24160 1726853532.03431: Calling groups_inventory to load vars for managed_node1 24160 1726853532.03433: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853532.03440: Calling all_plugins_play to load vars for managed_node1 24160 1726853532.03443: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853532.03445: Calling groups_plugins_play to load vars for managed_node1 24160 1726853532.03577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853532.03722: done with get_vars() 24160 1726853532.03729: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:32:12 -0400 (0:00:00.035) 0:00:08.440 ****** 24160 1726853532.03798: entering _queue_task() for managed_node1/stat 24160 1726853532.03984: worker is 1 (out of 1 
available) 24160 1726853532.03996: exiting _queue_task() for managed_node1/stat 24160 1726853532.04008: done queuing things up, now waiting for results queue to drain 24160 1726853532.04010: waiting for pending results... 24160 1726853532.04160: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 24160 1726853532.04248: in run() - task 02083763-bbaf-5676-4eb4-000000000390 24160 1726853532.04261: variable 'ansible_search_path' from source: unknown 24160 1726853532.04264: variable 'ansible_search_path' from source: unknown 24160 1726853532.04292: calling self._execute() 24160 1726853532.04350: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853532.04353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853532.04362: variable 'omit' from source: magic vars 24160 1726853532.04610: variable 'ansible_distribution_major_version' from source: facts 24160 1726853532.04619: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853532.04726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853532.04915: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853532.04944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853532.04972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853532.04996: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853532.05054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853532.05074: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853532.05092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853532.05115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853532.05174: variable '__network_is_ostree' from source: set_fact 24160 1726853532.05180: Evaluated conditional (not __network_is_ostree is defined): False 24160 1726853532.05183: when evaluation is False, skipping this task 24160 1726853532.05186: _execute() done 24160 1726853532.05188: dumping result to json 24160 1726853532.05192: done dumping result, returning 24160 1726853532.05198: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5676-4eb4-000000000390] 24160 1726853532.05202: sending task result for task 02083763-bbaf-5676-4eb4-000000000390 24160 1726853532.05283: done sending task result for task 02083763-bbaf-5676-4eb4-000000000390 24160 1726853532.05286: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24160 1726853532.05372: no more pending results, returning what we have 24160 1726853532.05375: results queue empty 24160 1726853532.05376: checking for any_errors_fatal 24160 1726853532.05381: done checking for any_errors_fatal 24160 1726853532.05382: checking for max_fail_percentage 24160 1726853532.05383: done checking for max_fail_percentage 24160 1726853532.05384: checking to see if all hosts have failed and the running result is not ok 24160 
1726853532.05384: done checking to see if all hosts have failed 24160 1726853532.05385: getting the remaining hosts for this loop 24160 1726853532.05386: done getting the remaining hosts for this loop 24160 1726853532.05389: getting the next task for host managed_node1 24160 1726853532.05394: done getting next task for host managed_node1 24160 1726853532.05397: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24160 1726853532.05400: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853532.05412: getting variables 24160 1726853532.05413: in VariableManager get_vars() 24160 1726853532.05441: Calling all_inventory to load vars for managed_node1 24160 1726853532.05443: Calling groups_inventory to load vars for managed_node1 24160 1726853532.05446: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853532.05453: Calling all_plugins_play to load vars for managed_node1 24160 1726853532.05455: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853532.05457: Calling groups_plugins_play to load vars for managed_node1 24160 1726853532.05561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853532.05682: done with get_vars() 24160 1726853532.05689: done getting variables 24160 1726853532.05725: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:32:12 -0400 (0:00:00.019) 0:00:08.460 ****** 24160 1726853532.05748: entering _queue_task() for managed_node1/set_fact 24160 1726853532.05928: worker is 1 (out of 1 available) 24160 1726853532.05941: exiting _queue_task() for managed_node1/set_fact 24160 1726853532.05953: done queuing things up, now waiting for results queue to drain 24160 1726853532.05954: waiting for pending results... 
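The skip of `Ensure ansible_facts used by role are present` above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: gather facts only if some required fact is missing. The Jinja2 `difference` filter is essentially a set difference, so the logic can be sketched in plain Python (fact names below are illustrative, not the role's actual `__network_required_facts` list):

```python
# Sketch of: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Fact names are hypothetical examples, not the role's real defaults.
required_facts = {"distribution", "distribution_major_version"}
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

missing = required_facts.difference(ansible_facts.keys())  # the difference filter
run_setup = len(missing) > 0
print(run_setup)  # False: nothing is missing, so the setup task is skipped
```

That matches the log: the conditional evaluated False, so the task was skipped rather than re-running fact gathering.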
24160 1726853532.06107: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24160 1726853532.06189: in run() - task 02083763-bbaf-5676-4eb4-000000000391 24160 1726853532.06200: variable 'ansible_search_path' from source: unknown 24160 1726853532.06204: variable 'ansible_search_path' from source: unknown 24160 1726853532.06229: calling self._execute() 24160 1726853532.06289: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853532.06293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853532.06301: variable 'omit' from source: magic vars 24160 1726853532.06547: variable 'ansible_distribution_major_version' from source: facts 24160 1726853532.06557: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853532.06684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853532.07084: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853532.07088: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853532.07190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853532.07194: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853532.07211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853532.07235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853532.07262: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853532.07289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853532.07368: variable '__network_is_ostree' from source: set_fact 24160 1726853532.07377: Evaluated conditional (not __network_is_ostree is defined): False 24160 1726853532.07380: when evaluation is False, skipping this task 24160 1726853532.07382: _execute() done 24160 1726853532.07386: dumping result to json 24160 1726853532.07389: done dumping result, returning 24160 1726853532.07405: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5676-4eb4-000000000391] 24160 1726853532.07407: sending task result for task 02083763-bbaf-5676-4eb4-000000000391 24160 1726853532.07569: done sending task result for task 02083763-bbaf-5676-4eb4-000000000391 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24160 1726853532.07617: no more pending results, returning what we have 24160 1726853532.07620: results queue empty 24160 1726853532.07621: checking for any_errors_fatal 24160 1726853532.07626: done checking for any_errors_fatal 24160 1726853532.07626: checking for max_fail_percentage 24160 1726853532.07628: done checking for max_fail_percentage 24160 1726853532.07628: checking to see if all hosts have failed and the running result is not ok 24160 1726853532.07629: done checking to see if all hosts have failed 24160 1726853532.07630: getting the remaining hosts for this loop 24160 1726853532.07631: done getting the remaining hosts for this loop 24160 1726853532.07634: getting the next task for 
host managed_node1 24160 1726853532.07641: done getting next task for host managed_node1 24160 1726853532.07644: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24160 1726853532.07647: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853532.07659: getting variables 24160 1726853532.07660: in VariableManager get_vars() 24160 1726853532.07738: Calling all_inventory to load vars for managed_node1 24160 1726853532.07741: Calling groups_inventory to load vars for managed_node1 24160 1726853532.07744: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853532.07759: WORKER PROCESS EXITING 24160 1726853532.07768: Calling all_plugins_play to load vars for managed_node1 24160 1726853532.07773: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853532.07776: Calling groups_plugins_play to load vars for managed_node1 24160 1726853532.07977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853532.08105: done with get_vars() 24160 1726853532.08112: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:32:12 -0400 (0:00:00.024) 0:00:08.484 ****** 24160 1726853532.08178: entering _queue_task() for managed_node1/service_facts 24160 1726853532.08179: Creating lock for service_facts 24160 1726853532.08362: worker is 1 (out of 1 available) 24160 1726853532.08376: exiting _queue_task() for managed_node1/service_facts 24160 1726853532.08386: done queuing things up, now waiting for results queue to drain 24160 1726853532.08388: waiting for pending results... 
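Both ostree tasks above short-circuit the same way: the `when:` condition `not __network_is_ostree is defined` is False (the fact was already set earlier), so the executor returns a skip result instead of running the handler. A rough sketch of how such a result could be assembled, using the field names printed in the log (the control flow is heavily simplified relative to `TaskExecutor`):

```python
# Simplified model of a skipped-task result, mirroring the fields shown in
# the "skipping: [managed_node1]" output above.
def run_task(condition_str: str, condition_result: bool) -> dict:
    if not condition_result:
        return {
            "changed": False,
            "false_condition": condition_str,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}  # placeholder for actually running the handler

result = run_task("not __network_is_ostree is defined", condition_result=False)
print(result["skip_reason"])  # Conditional result was False
```

This is why `__network_is_ostree` being set by an earlier `set_fact` suppresses both the `stat` probe and the follow-up `set_fact` here.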
24160 1726853532.08538: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 24160 1726853532.08620: in run() - task 02083763-bbaf-5676-4eb4-000000000393 24160 1726853532.08631: variable 'ansible_search_path' from source: unknown 24160 1726853532.08634: variable 'ansible_search_path' from source: unknown 24160 1726853532.08662: calling self._execute() 24160 1726853532.08720: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853532.08723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853532.08735: variable 'omit' from source: magic vars 24160 1726853532.08980: variable 'ansible_distribution_major_version' from source: facts 24160 1726853532.08989: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853532.08995: variable 'omit' from source: magic vars 24160 1726853532.09037: variable 'omit' from source: magic vars 24160 1726853532.09064: variable 'omit' from source: magic vars 24160 1726853532.09096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853532.09121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853532.09136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853532.09148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853532.09161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853532.09185: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853532.09188: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853532.09191: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 24160 1726853532.09255: Set connection var ansible_shell_executable to /bin/sh 24160 1726853532.09261: Set connection var ansible_pipelining to False 24160 1726853532.09264: Set connection var ansible_connection to ssh 24160 1726853532.09267: Set connection var ansible_shell_type to sh 24160 1726853532.09281: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853532.09284: Set connection var ansible_timeout to 10 24160 1726853532.09300: variable 'ansible_shell_executable' from source: unknown 24160 1726853532.09303: variable 'ansible_connection' from source: unknown 24160 1726853532.09306: variable 'ansible_module_compression' from source: unknown 24160 1726853532.09309: variable 'ansible_shell_type' from source: unknown 24160 1726853532.09311: variable 'ansible_shell_executable' from source: unknown 24160 1726853532.09313: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853532.09316: variable 'ansible_pipelining' from source: unknown 24160 1726853532.09318: variable 'ansible_timeout' from source: unknown 24160 1726853532.09322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853532.09460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853532.09468: variable 'omit' from source: magic vars 24160 1726853532.09475: starting attempt loop 24160 1726853532.09478: running the handler 24160 1726853532.09491: _low_level_execute_command(): starting 24160 1726853532.09501: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853532.10393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853532.11872: stdout chunk (state=3): >>>/root <<< 24160 1726853532.12012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853532.12022: stdout chunk (state=3): >>><<< 24160 1726853532.12038: stderr chunk (state=3): >>><<< 24160 1726853532.12062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853532.12083: _low_level_execute_command(): starting 24160 1726853532.12094: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286 `" && echo ansible-tmp-1726853532.1206906-24623-24772123516286="` echo /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286 `" ) && sleep 0' 24160 1726853532.12722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853532.12737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853532.12757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853532.12783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853532.12802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853532.12812: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853532.12832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853532.12931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853532.12965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853532.13043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853532.15069: stdout chunk (state=3): >>>ansible-tmp-1726853532.1206906-24623-24772123516286=/root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286 <<< 24160 1726853532.15366: stdout chunk (state=3): >>><<< 24160 1726853532.15370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853532.15375: stderr chunk (state=3): >>><<< 24160 1726853532.15378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853532.1206906-24623-24772123516286=/root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853532.15380: variable 'ansible_module_compression' from source: unknown 24160 1726853532.15436: ANSIBALLZ: Using lock for service_facts 24160 1726853532.15445: ANSIBALLZ: Acquiring lock 24160 1726853532.15458: ANSIBALLZ: Lock acquired: 140302796881312 24160 1726853532.15582: ANSIBALLZ: Creating module 24160 1726853532.29779: ANSIBALLZ: Writing module into payload 24160 1726853532.29844: ANSIBALLZ: Writing module 24160 1726853532.29865: ANSIBALLZ: Renaming module 24160 1726853532.29873: ANSIBALLZ: Done creating module 24160 1726853532.29888: variable 'ansible_facts' from source: unknown 24160 1726853532.29938: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py 24160 1726853532.30036: Sending initial data 24160 1726853532.30039: Sent initial data (161 bytes) 24160 1726853532.30501: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853532.30504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853532.30507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853532.30510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853532.30515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853532.30622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853532.30682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853532.32302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853532.32338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853532.32384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpx0jx8g0a /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py <<< 24160 1726853532.32387: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py" <<< 24160 1726853532.32425: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpx0jx8g0a" to remote "/root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py" <<< 24160 1726853532.32427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py" <<< 24160 1726853532.32960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853532.32996: stderr chunk (state=3): >>><<< 24160 1726853532.32999: stdout chunk (state=3): >>><<< 24160 1726853532.33035: done transferring module to remote 24160 1726853532.33044: _low_level_execute_command(): starting 24160 1726853532.33049: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/ /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py && sleep 0' 24160 1726853532.33615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853532.33653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853532.35389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853532.35410: stderr chunk (state=3): >>><<< 24160 1726853532.35414: stdout chunk (state=3): >>><<< 24160 1726853532.35427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853532.35432: _low_level_execute_command(): starting 24160 1726853532.35443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/AnsiballZ_service_facts.py && sleep 0' 24160 1726853532.35859: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853532.35862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853532.35864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853532.35867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853532.35872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853532.35920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853532.35923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853532.35975: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24160 1726853533.89705: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": 
"systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24160 1726853533.91075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853533.91262: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 24160 1726853533.91266: stdout chunk (state=3): >>><<< 24160 1726853533.91268: stderr chunk (state=3): >>><<< 24160 1726853533.91291: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853533.95899: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853533.95904: _low_level_execute_command(): starting 24160 1726853533.95906: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853532.1206906-24623-24772123516286/ > /dev/null 2>&1 && sleep 0' 24160 1726853533.96850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853533.96869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853533.96968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853533.97003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853533.99069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853533.99075: stdout chunk (state=3): >>><<< 24160 1726853533.99078: stderr chunk (state=3): >>><<< 24160 1726853533.99093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853533.99477: handler run complete 24160 1726853533.99590: variable 'ansible_facts' from source: unknown 24160 1726853533.99775: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853534.00267: variable 'ansible_facts' from source: unknown 24160 1726853534.00406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853534.00633: attempt loop complete, returning result 24160 1726853534.00643: _execute() done 24160 1726853534.00650: dumping result to json 24160 1726853534.00721: done dumping result, returning 24160 1726853534.00739: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5676-4eb4-000000000393] 24160 1726853534.00749: sending task result for task 02083763-bbaf-5676-4eb4-000000000393 24160 1726853534.02136: done sending task result for task 02083763-bbaf-5676-4eb4-000000000393 24160 1726853534.02140: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853534.02243: no more pending results, returning what we have 24160 1726853534.02246: results queue empty 24160 1726853534.02247: checking for any_errors_fatal 24160 1726853534.02250: done checking for any_errors_fatal 24160 1726853534.02251: checking for max_fail_percentage 24160 1726853534.02252: done checking for max_fail_percentage 24160 1726853534.02253: checking to see if all hosts have failed and the running result is not ok 24160 1726853534.02256: done checking to see if all hosts have failed 24160 1726853534.02257: getting the remaining hosts for this loop 24160 1726853534.02258: done getting the remaining hosts for this loop 24160 1726853534.02261: getting the next task for host managed_node1 24160 1726853534.02266: done getting next task for host managed_node1 24160 1726853534.02269: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24160 
1726853534.02277: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853534.02287: getting variables 24160 1726853534.02289: in VariableManager get_vars() 24160 1726853534.02317: Calling all_inventory to load vars for managed_node1 24160 1726853534.02320: Calling groups_inventory to load vars for managed_node1 24160 1726853534.02322: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853534.02331: Calling all_plugins_play to load vars for managed_node1 24160 1726853534.02333: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853534.02336: Calling groups_plugins_play to load vars for managed_node1 24160 1726853534.03015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853534.04047: done with get_vars() 24160 1726853534.04064: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:32:14 -0400 (0:00:01.959) 0:00:10.444 ****** 24160 1726853534.04388: entering 
_queue_task() for managed_node1/package_facts 24160 1726853534.04391: Creating lock for package_facts 24160 1726853534.04887: worker is 1 (out of 1 available) 24160 1726853534.04903: exiting _queue_task() for managed_node1/package_facts 24160 1726853534.04915: done queuing things up, now waiting for results queue to drain 24160 1726853534.04937: waiting for pending results... 24160 1726853534.05268: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 24160 1726853534.05342: in run() - task 02083763-bbaf-5676-4eb4-000000000394 24160 1726853534.05369: variable 'ansible_search_path' from source: unknown 24160 1726853534.05380: variable 'ansible_search_path' from source: unknown 24160 1726853534.05420: calling self._execute() 24160 1726853534.05510: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853534.05521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853534.05534: variable 'omit' from source: magic vars 24160 1726853534.05917: variable 'ansible_distribution_major_version' from source: facts 24160 1726853534.05935: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853534.05949: variable 'omit' from source: magic vars 24160 1726853534.06051: variable 'omit' from source: magic vars 24160 1726853534.06060: variable 'omit' from source: magic vars 24160 1726853534.06105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853534.06148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853534.06179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853534.06201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853534.06238: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853534.06255: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853534.06269: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853534.06347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853534.06383: Set connection var ansible_shell_executable to /bin/sh 24160 1726853534.06395: Set connection var ansible_pipelining to False 24160 1726853534.06402: Set connection var ansible_connection to ssh 24160 1726853534.06409: Set connection var ansible_shell_type to sh 24160 1726853534.06419: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853534.06432: Set connection var ansible_timeout to 10 24160 1726853534.06463: variable 'ansible_shell_executable' from source: unknown 24160 1726853534.06490: variable 'ansible_connection' from source: unknown 24160 1726853534.06499: variable 'ansible_module_compression' from source: unknown 24160 1726853534.06505: variable 'ansible_shell_type' from source: unknown 24160 1726853534.06511: variable 'ansible_shell_executable' from source: unknown 24160 1726853534.06517: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853534.06523: variable 'ansible_pipelining' from source: unknown 24160 1726853534.06529: variable 'ansible_timeout' from source: unknown 24160 1726853534.06536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853534.06779: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853534.06813: variable 'omit' from source: magic vars 24160 1726853534.06816: starting attempt loop 24160 
1726853534.06819: running the handler 24160 1726853534.06888: _low_level_execute_command(): starting 24160 1726853534.06895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853534.07663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853534.07711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853534.07730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853534.07750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853534.07843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853534.09488: stdout chunk (state=3): >>>/root <<< 24160 1726853534.09620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853534.09637: stderr chunk (state=3): >>><<< 24160 1726853534.09649: stdout chunk (state=3): >>><<< 24160 1726853534.09679: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853534.09777: _low_level_execute_command(): starting 24160 1726853534.09781: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406 `" && echo ansible-tmp-1726853534.0968606-24701-20259451516406="` echo /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406 `" ) && sleep 0' 24160 1726853534.10319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853534.10333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853534.10347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853534.10363: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853534.10384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853534.10425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853534.10504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853534.10546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853534.10610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853534.12490: stdout chunk (state=3): >>>ansible-tmp-1726853534.0968606-24701-20259451516406=/root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406 <<< 24160 1726853534.12650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853534.12654: stdout chunk (state=3): >>><<< 24160 1726853534.12656: stderr chunk (state=3): >>><<< 24160 1726853534.12676: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853534.0968606-24701-20259451516406=/root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853534.12878: variable 'ansible_module_compression' from source: unknown 24160 1726853534.12881: ANSIBALLZ: Using lock for package_facts 24160 1726853534.12884: ANSIBALLZ: Acquiring lock 24160 1726853534.12886: ANSIBALLZ: Lock acquired: 140302793230864 24160 1726853534.12888: ANSIBALLZ: Creating module 24160 1726853534.48916: ANSIBALLZ: Writing module into payload 24160 1726853534.49477: ANSIBALLZ: Writing module 24160 1726853534.49481: ANSIBALLZ: Renaming module 24160 1726853534.49484: ANSIBALLZ: Done creating module 24160 1726853534.49486: variable 'ansible_facts' from source: unknown 24160 1726853534.49770: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py 24160 1726853534.50098: Sending initial data 24160 1726853534.50102: Sent initial data (161 bytes) 24160 1726853534.51270: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853534.51490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853534.51566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853534.53222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853534.53279: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853534.53319: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpjs1cqi0e /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py <<< 24160 1726853534.53332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py" <<< 24160 1726853534.53349: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpjs1cqi0e" to remote "/root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py" <<< 24160 1726853534.53375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py" <<< 24160 1726853534.56174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853534.56180: stderr chunk (state=3): >>><<< 24160 1726853534.56183: stdout chunk (state=3): >>><<< 24160 1726853534.56185: done transferring module to remote 24160 1726853534.56187: _low_level_execute_command(): starting 24160 1726853534.56189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/ /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py && sleep 0' 24160 1726853534.57396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853534.57401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853534.57486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853534.57557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853534.59515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853534.59544: stderr chunk (state=3): >>><<< 24160 1726853534.59553: stdout chunk (state=3): >>><<< 24160 1726853534.59857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853534.59861: _low_level_execute_command(): starting 24160 1726853534.59864: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/AnsiballZ_package_facts.py && sleep 0' 24160 1726853534.60659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853534.60662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853534.60688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853534.60770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853534.60786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853534.60813: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853534.60903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853535.04662: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 24160 1726853535.04683: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", 
"release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 24160 1726853535.04722: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", 
"version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": 
[{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 24160 1726853535.04743: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": 
[{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 24160 1726853535.04754: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": 
"cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": 
"gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": 
"rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", 
"version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": 
"lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}],
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud<<< 24160 1726853535.04882: stdout chunk (state=3): >>>-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24160 1726853535.06616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853535.06650: stderr chunk (state=3): >>><<< 24160 1726853535.06653: stdout chunk (state=3): >>><<< 24160 1726853535.06690: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853535.08250: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853535.08269: _low_level_execute_command(): starting 24160 1726853535.08274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853534.0968606-24701-20259451516406/ > /dev/null 2>&1 && sleep 0' 24160 1726853535.08722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853535.08726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853535.08737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853535.08803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853535.08810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853535.08813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853535.08851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853535.10697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853535.10725: stderr chunk (state=3): >>><<< 24160 1726853535.10728: stdout chunk (state=3): >>><<< 24160 1726853535.10741: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853535.10747: 
handler run complete 24160 1726853535.11202: variable 'ansible_facts' from source: unknown 24160 1726853535.11443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853535.12528: variable 'ansible_facts' from source: unknown 24160 1726853535.15631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853535.16011: attempt loop complete, returning result 24160 1726853535.16020: _execute() done 24160 1726853535.16023: dumping result to json 24160 1726853535.16136: done dumping result, returning 24160 1726853535.16142: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5676-4eb4-000000000394] 24160 1726853535.16145: sending task result for task 02083763-bbaf-5676-4eb4-000000000394 24160 1726853535.17396: done sending task result for task 02083763-bbaf-5676-4eb4-000000000394 24160 1726853535.17400: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853535.17443: no more pending results, returning what we have 24160 1726853535.17445: results queue empty 24160 1726853535.17445: checking for any_errors_fatal 24160 1726853535.17448: done checking for any_errors_fatal 24160 1726853535.17449: checking for max_fail_percentage 24160 1726853535.17450: done checking for max_fail_percentage 24160 1726853535.17450: checking to see if all hosts have failed and the running result is not ok 24160 1726853535.17451: done checking to see if all hosts have failed 24160 1726853535.17451: getting the remaining hosts for this loop 24160 1726853535.17452: done getting the remaining hosts for this loop 24160 1726853535.17455: getting the next task for host managed_node1 24160 1726853535.17459: done getting next task for host managed_node1 24160 
1726853535.17461: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24160 1726853535.17464: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853535.17470: getting variables 24160 1726853535.17472: in VariableManager get_vars() 24160 1726853535.17495: Calling all_inventory to load vars for managed_node1 24160 1726853535.17496: Calling groups_inventory to load vars for managed_node1 24160 1726853535.17498: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853535.17507: Calling all_plugins_play to load vars for managed_node1 24160 1726853535.17510: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853535.17513: Calling groups_plugins_play to load vars for managed_node1 24160 1726853535.18235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853535.19155: done with get_vars() 24160 1726853535.19180: done getting variables 24160 1726853535.19236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:32:15 -0400 (0:00:01.150) 0:00:11.595 ****** 24160 1726853535.19276: entering _queue_task() for managed_node1/debug 24160 1726853535.19584: worker is 1 (out of 1 available) 24160 1726853535.19598: exiting _queue_task() for managed_node1/debug 24160 1726853535.19611: done queuing things up, now waiting for results queue to drain 24160 1726853535.19612: waiting for pending results... 24160 1726853535.20037: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 24160 1726853535.20051: in run() - task 02083763-bbaf-5676-4eb4-000000000018 24160 1726853535.20076: variable 'ansible_search_path' from source: unknown 24160 1726853535.20085: variable 'ansible_search_path' from source: unknown 24160 1726853535.20125: calling self._execute() 24160 1726853535.20216: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853535.20231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853535.20249: variable 'omit' from source: magic vars 24160 1726853535.20632: variable 'ansible_distribution_major_version' from source: facts 24160 1726853535.20650: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853535.20662: variable 'omit' from source: magic vars 24160 1726853535.20721: variable 'omit' from source: magic vars 24160 1726853535.20828: variable 'network_provider' from source: set_fact 24160 1726853535.20852: variable 'omit' from source: magic vars 24160 1726853535.20976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853535.20981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853535.20983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 
1726853535.20985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853535.21002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853535.21037: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853535.21046: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853535.21054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853535.21156: Set connection var ansible_shell_executable to /bin/sh 24160 1726853535.21167: Set connection var ansible_pipelining to False 24160 1726853535.21177: Set connection var ansible_connection to ssh 24160 1726853535.21184: Set connection var ansible_shell_type to sh 24160 1726853535.21196: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853535.21209: Set connection var ansible_timeout to 10 24160 1726853535.21238: variable 'ansible_shell_executable' from source: unknown 24160 1726853535.21247: variable 'ansible_connection' from source: unknown 24160 1726853535.21254: variable 'ansible_module_compression' from source: unknown 24160 1726853535.21261: variable 'ansible_shell_type' from source: unknown 24160 1726853535.21268: variable 'ansible_shell_executable' from source: unknown 24160 1726853535.21329: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853535.21332: variable 'ansible_pipelining' from source: unknown 24160 1726853535.21334: variable 'ansible_timeout' from source: unknown 24160 1726853535.21337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853535.21453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853535.21469: variable 'omit' from source: magic vars 24160 1726853535.21488: starting attempt loop 24160 1726853535.21492: running the handler 24160 1726853535.21547: handler run complete 24160 1726853535.21568: attempt loop complete, returning result 24160 1726853535.21573: _execute() done 24160 1726853535.21576: dumping result to json 24160 1726853535.21578: done dumping result, returning 24160 1726853535.21581: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5676-4eb4-000000000018] 24160 1726853535.21585: sending task result for task 02083763-bbaf-5676-4eb4-000000000018 24160 1726853535.21668: done sending task result for task 02083763-bbaf-5676-4eb4-000000000018 24160 1726853535.21673: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 24160 1726853535.21733: no more pending results, returning what we have 24160 1726853535.21737: results queue empty 24160 1726853535.21738: checking for any_errors_fatal 24160 1726853535.21745: done checking for any_errors_fatal 24160 1726853535.21746: checking for max_fail_percentage 24160 1726853535.21747: done checking for max_fail_percentage 24160 1726853535.21749: checking to see if all hosts have failed and the running result is not ok 24160 1726853535.21749: done checking to see if all hosts have failed 24160 1726853535.21750: getting the remaining hosts for this loop 24160 1726853535.21751: done getting the remaining hosts for this loop 24160 1726853535.21757: getting the next task for host managed_node1 24160 1726853535.21764: done getting next task for host managed_node1 24160 1726853535.21767: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 24160 1726853535.21773: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853535.21784: getting variables 24160 1726853535.21785: in VariableManager get_vars() 24160 1726853535.21819: Calling all_inventory to load vars for managed_node1 24160 1726853535.21822: Calling groups_inventory to load vars for managed_node1 24160 1726853535.21824: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853535.21833: Calling all_plugins_play to load vars for managed_node1 24160 1726853535.21836: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853535.21838: Calling groups_plugins_play to load vars for managed_node1 24160 1726853535.22700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853535.25317: done with get_vars() 24160 1726853535.25347: done getting variables 24160 1726853535.25410: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:32:15 -0400 (0:00:00.061) 0:00:11.657 ****** 24160 1726853535.25446: entering _queue_task() for managed_node1/fail 24160 1726853535.25755: worker is 1 (out of 1 available) 24160 1726853535.25767: exiting _queue_task() for managed_node1/fail 24160 1726853535.25981: done queuing things up, now waiting for results queue to drain 24160 1726853535.25983: waiting for pending results... 24160 1726853535.26042: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24160 1726853535.26174: in run() - task 02083763-bbaf-5676-4eb4-000000000019 24160 1726853535.26208: variable 'ansible_search_path' from source: unknown 24160 1726853535.26211: variable 'ansible_search_path' from source: unknown 24160 1726853535.26244: calling self._execute() 24160 1726853535.26377: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853535.26381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853535.26383: variable 'omit' from source: magic vars 24160 1726853535.26735: variable 'ansible_distribution_major_version' from source: facts 24160 1726853535.26757: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853535.26880: variable 'network_state' from source: role '' defaults 24160 1726853535.26894: Evaluated conditional (network_state != {}): False 24160 1726853535.26901: when evaluation is False, skipping this task 24160 1726853535.26908: _execute() done 24160 1726853535.26968: dumping result to json 24160 1726853535.26973: done dumping result, returning 24160 1726853535.26976: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [02083763-bbaf-5676-4eb4-000000000019] 24160 1726853535.26979: sending task result for task 02083763-bbaf-5676-4eb4-000000000019 24160 1726853535.27046: done sending task result for task 02083763-bbaf-5676-4eb4-000000000019 24160 1726853535.27049: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853535.27121: no more pending results, returning what we have 24160 1726853535.27125: results queue empty 24160 1726853535.27126: checking for any_errors_fatal 24160 1726853535.27133: done checking for any_errors_fatal 24160 1726853535.27133: checking for max_fail_percentage 24160 1726853535.27136: done checking for max_fail_percentage 24160 1726853535.27137: checking to see if all hosts have failed and the running result is not ok 24160 1726853535.27138: done checking to see if all hosts have failed 24160 1726853535.27138: getting the remaining hosts for this loop 24160 1726853535.27140: done getting the remaining hosts for this loop 24160 1726853535.27143: getting the next task for host managed_node1 24160 1726853535.27151: done getting next task for host managed_node1 24160 1726853535.27154: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24160 1726853535.27158: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24160 1726853535.27177: getting variables 24160 1726853535.27178: in VariableManager get_vars() 24160 1726853535.27215: Calling all_inventory to load vars for managed_node1 24160 1726853535.27218: Calling groups_inventory to load vars for managed_node1 24160 1726853535.27220: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853535.27232: Calling all_plugins_play to load vars for managed_node1 24160 1726853535.27236: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853535.27239: Calling groups_plugins_play to load vars for managed_node1 24160 1726853535.28859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853535.31133: done with get_vars() 24160 1726853535.31159: done getting variables 24160 1726853535.31223: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:32:15 -0400 (0:00:00.058) 0:00:11.715 ****** 24160 1726853535.31259: entering _queue_task() for managed_node1/fail 24160 1726853535.32002: worker is 1 (out of 1 available) 24160 1726853535.32017: exiting _queue_task() for managed_node1/fail 24160 1726853535.32028: done queuing things up, now waiting for results queue to drain 24160 1726853535.32030: waiting for pending results... 
24160 1726853535.32691: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
24160 1726853535.32924: in run() - task 02083763-bbaf-5676-4eb4-00000000001a
24160 1726853535.32940: variable 'ansible_search_path' from source: unknown
24160 1726853535.32943: variable 'ansible_search_path' from source: unknown
24160 1726853535.32985: calling self._execute()
24160 1726853535.33220: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853535.33224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853535.33276: variable 'omit' from source: magic vars
24160 1726853535.34257: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.34261: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853535.34445: variable 'network_state' from source: role '' defaults
24160 1726853535.34518: Evaluated conditional (network_state != {}): False
24160 1726853535.34527: when evaluation is False, skipping this task
24160 1726853535.34533: _execute() done
24160 1726853535.34539: dumping result to json
24160 1726853535.34614: done dumping result, returning
24160 1726853535.34618: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5676-4eb4-00000000001a]
24160 1726853535.34620: sending task result for task 02083763-bbaf-5676-4eb4-00000000001a
24160 1726853535.35070: done sending task result for task 02083763-bbaf-5676-4eb4-00000000001a
24160 1726853535.35077: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
24160 1726853535.35126: no more pending results, returning what we have
24160 1726853535.35129: results queue empty
24160 1726853535.35131: checking for any_errors_fatal
24160 1726853535.35135: done checking for any_errors_fatal
24160 1726853535.35136: checking for max_fail_percentage
24160 1726853535.35138: done checking for max_fail_percentage
24160 1726853535.35139: checking to see if all hosts have failed and the running result is not ok
24160 1726853535.35140: done checking to see if all hosts have failed
24160 1726853535.35141: getting the remaining hosts for this loop
24160 1726853535.35142: done getting the remaining hosts for this loop
24160 1726853535.35146: getting the next task for host managed_node1
24160 1726853535.35152: done getting next task for host managed_node1
24160 1726853535.35155: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
24160 1726853535.35158: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853535.35176: getting variables
24160 1726853535.35178: in VariableManager get_vars()
24160 1726853535.35219: Calling all_inventory to load vars for managed_node1
24160 1726853535.35222: Calling groups_inventory to load vars for managed_node1
24160 1726853535.35225: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853535.35236: Calling all_plugins_play to load vars for managed_node1
24160 1726853535.35239: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853535.35242: Calling groups_plugins_play to load vars for managed_node1
24160 1726853535.38012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853535.40420: done with get_vars()
24160 1726853535.40453: done getting variables
24160 1726853535.40521: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:32:15 -0400 (0:00:00.092) 0:00:11.808 ******
24160 1726853535.40555: entering _queue_task() for managed_node1/fail
24160 1726853535.41082: worker is 1 (out of 1 available)
24160 1726853535.41093: exiting _queue_task() for managed_node1/fail
24160 1726853535.41103: done queuing things up, now waiting for results queue to drain
24160 1726853535.41105: waiting for pending results...
24160 1726853535.41233: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
24160 1726853535.41346: in run() - task 02083763-bbaf-5676-4eb4-00000000001b
24160 1726853535.41366: variable 'ansible_search_path' from source: unknown
24160 1726853535.41376: variable 'ansible_search_path' from source: unknown
24160 1726853535.41418: calling self._execute()
24160 1726853535.41526: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853535.41887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853535.41891: variable 'omit' from source: magic vars
24160 1726853535.42267: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.42286: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853535.42632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24160 1726853535.45986: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24160 1726853535.46061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24160 1726853535.46130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24160 1726853535.46210: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24160 1726853535.46245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24160 1726853535.46320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.46353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.46383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.46425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.46448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.46536: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.46559: Evaluated conditional (ansible_distribution_major_version | int > 9): True
24160 1726853535.46684: variable 'ansible_distribution' from source: facts
24160 1726853535.46692: variable '__network_rh_distros' from source: role '' defaults
24160 1726853535.46709: Evaluated conditional (ansible_distribution in __network_rh_distros): True
24160 1726853535.47082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.47117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.47276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.47280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.47282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.47284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.47576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.47580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.47582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.47585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.47702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.47734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.47768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.47909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.47946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.48595: variable 'network_connections' from source: task vars
24160 1726853535.48803: variable 'interface' from source: set_fact
24160 1726853535.48806: variable 'interface' from source: set_fact
24160 1726853535.48808: variable 'interface' from source: set_fact
24160 1726853535.48944: variable 'interface' from source: set_fact
24160 1726853535.49033: variable 'network_state' from source: role '' defaults
24160 1726853535.49103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24160 1726853535.49357: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24160 1726853535.49401: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24160 1726853535.49435: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24160 1726853535.49475: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24160 1726853535.49535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24160 1726853535.49576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24160 1726853535.49605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.49633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24160 1726853535.49679: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
24160 1726853535.49687: when evaluation is False, skipping this task
24160 1726853535.49694: _execute() done
24160 1726853535.49700: dumping result to json
24160 1726853535.49707: done dumping result, returning
24160 1726853535.49718: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5676-4eb4-00000000001b]
24160 1726853535.49727: sending task result for task 02083763-bbaf-5676-4eb4-00000000001b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
24160 1726853535.49970: no more pending results, returning what we have
24160 1726853535.49976: results queue empty
24160 1726853535.49977: checking for any_errors_fatal
24160 1726853535.49982: done checking for any_errors_fatal
24160 1726853535.49983: checking for max_fail_percentage
24160 1726853535.49985: done checking for max_fail_percentage
24160 1726853535.49986: checking to see if all hosts have failed and the running result is not ok
24160 1726853535.49987: done checking to see if all hosts have failed
24160 1726853535.49987: getting the remaining hosts for this loop
24160 1726853535.49989: done getting the remaining hosts for this loop
24160 1726853535.49993: getting the next task for host managed_node1
24160 1726853535.49999: done getting next task for host managed_node1
24160 1726853535.50003: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
24160 1726853535.50006: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853535.50020: getting variables
24160 1726853535.50022: in VariableManager get_vars()
24160 1726853535.50062: Calling all_inventory to load vars for managed_node1
24160 1726853535.50065: Calling groups_inventory to load vars for managed_node1
24160 1726853535.50067: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853535.50079: Calling all_plugins_play to load vars for managed_node1
24160 1726853535.50082: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853535.50085: Calling groups_plugins_play to load vars for managed_node1
24160 1726853535.50684: done sending task result for task 02083763-bbaf-5676-4eb4-00000000001b
24160 1726853535.50687: WORKER PROCESS EXITING
24160 1726853535.52253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853535.54018: done with get_vars()
24160 1726853535.54049: done getting variables
24160 1726853535.54156: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:32:15 -0400 (0:00:00.136) 0:00:11.944 ******
24160 1726853535.54190: entering _queue_task() for managed_node1/dnf
24160 1726853535.54526: worker is 1 (out of 1 available)
24160 1726853535.54540: exiting _queue_task() for managed_node1/dnf
24160 1726853535.54552: done queuing things up, now waiting for results queue to drain
24160 1726853535.54554: waiting for pending results...
24160 1726853535.54822: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
24160 1726853535.54945: in run() - task 02083763-bbaf-5676-4eb4-00000000001c
24160 1726853535.54965: variable 'ansible_search_path' from source: unknown
24160 1726853535.54975: variable 'ansible_search_path' from source: unknown
24160 1726853535.55016: calling self._execute()
24160 1726853535.55109: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853535.55121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853535.55135: variable 'omit' from source: magic vars
24160 1726853535.55505: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.55522: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853535.55717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24160 1726853535.57920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24160 1726853535.57999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24160 1726853535.58045: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24160 1726853535.58138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24160 1726853535.58141: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24160 1726853535.58183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.58212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.58237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.58284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.58306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.58433: variable 'ansible_distribution' from source: facts
24160 1726853535.58444: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.58469: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
24160 1726853535.58776: variable '__network_wireless_connections_defined' from source: role '' defaults
24160 1726853535.58779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.58782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.58784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.58814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.58835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.58880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.58915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.58945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.58989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.59013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.59058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.59090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.59124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.59168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.59190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.59353: variable 'network_connections' from source: task vars
24160 1726853535.59372: variable 'interface' from source: set_fact
24160 1726853535.59444: variable 'interface' from source: set_fact
24160 1726853535.59459: variable 'interface' from source: set_fact
24160 1726853535.59520: variable 'interface' from source: set_fact
24160 1726853535.59599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24160 1726853535.59766: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24160 1726853535.59812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24160 1726853535.59878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24160 1726853535.59899: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24160 1726853535.59944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24160 1726853535.59986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24160 1726853535.60096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.60099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24160 1726853535.60109: variable '__network_team_connections_defined' from source: role '' defaults
24160 1726853535.60342: variable 'network_connections' from source: task vars
24160 1726853535.60353: variable 'interface' from source: set_fact
24160 1726853535.60421: variable 'interface' from source: set_fact
24160 1726853535.60433: variable 'interface' from source: set_fact
24160 1726853535.60495: variable 'interface' from source: set_fact
24160 1726853535.60534: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
24160 1726853535.60542: when evaluation is False, skipping this task
24160 1726853535.60549: _execute() done
24160 1726853535.60556: dumping result to json
24160 1726853535.60563: done dumping result, returning
24160 1726853535.60578: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000001c]
24160 1726853535.60588: sending task result for task 02083763-bbaf-5676-4eb4-00000000001c
24160 1726853535.60713: done sending task result for task 02083763-bbaf-5676-4eb4-00000000001c
24160 1726853535.60717: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
24160 1726853535.60792: no more pending results, returning what we have
24160 1726853535.60796: results queue empty
24160 1726853535.60797: checking for any_errors_fatal
24160 1726853535.60802: done checking for any_errors_fatal
24160 1726853535.60803: checking for max_fail_percentage
24160 1726853535.60805: done checking for max_fail_percentage
24160 1726853535.60805: checking to see if all hosts have failed and the running result is not ok
24160 1726853535.60806: done checking to see if all hosts have failed
24160 1726853535.60807: getting the remaining hosts for this loop
24160 1726853535.60808: done getting the remaining hosts for this loop
24160 1726853535.60812: getting the next task for host managed_node1
24160 1726853535.60818: done getting next task for host managed_node1
24160 1726853535.60822: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
24160 1726853535.60825: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853535.60838: getting variables
24160 1726853535.60839: in VariableManager get_vars()
24160 1726853535.60878: Calling all_inventory to load vars for managed_node1
24160 1726853535.60880: Calling groups_inventory to load vars for managed_node1
24160 1726853535.60883: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853535.60892: Calling all_plugins_play to load vars for managed_node1
24160 1726853535.60895: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853535.60897: Calling groups_plugins_play to load vars for managed_node1
24160 1726853535.62421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853535.64037: done with get_vars()
24160 1726853535.64060: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24160 1726853535.64134: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:32:15 -0400 (0:00:00.099) 0:00:12.044 ******
24160 1726853535.64166: entering _queue_task() for managed_node1/yum
24160 1726853535.64168: Creating lock for yum
24160 1726853535.64592: worker is 1 (out of 1 available)
24160 1726853535.64604: exiting _queue_task() for managed_node1/yum
24160 1726853535.64615: done queuing things up, now waiting for results queue to drain
24160 1726853535.64616: waiting for pending results...
24160 1726853535.65087: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
24160 1726853535.65092: in run() - task 02083763-bbaf-5676-4eb4-00000000001d
24160 1726853535.65096: variable 'ansible_search_path' from source: unknown
24160 1726853535.65098: variable 'ansible_search_path' from source: unknown
24160 1726853535.65101: calling self._execute()
24160 1726853535.65104: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853535.65107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853535.65109: variable 'omit' from source: magic vars
24160 1726853535.65461: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.65481: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853535.65658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24160 1726853535.67865: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24160 1726853535.67947: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24160 1726853535.67988: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24160 1726853535.68026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24160 1726853535.68061: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24160 1726853535.68139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.68178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.68207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.68254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.68279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.68376: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.68396: Evaluated conditional (ansible_distribution_major_version | int < 8): False
24160 1726853535.68404: when evaluation is False, skipping this task
24160 1726853535.68410: _execute() done
24160 1726853535.68418: dumping result to json
24160 1726853535.68425: done dumping result, returning
24160 1726853535.68435: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000001d]
24160 1726853535.68443: sending task result for task 02083763-bbaf-5676-4eb4-00000000001d
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
24160 1726853535.68631: no more pending results, returning what we have
24160 1726853535.68635: results queue empty
24160 1726853535.68636: checking for any_errors_fatal
24160 1726853535.68641: done checking for any_errors_fatal
24160 1726853535.68642: checking for max_fail_percentage
24160 1726853535.68644: done checking for max_fail_percentage
24160 1726853535.68645: checking to see if all hosts have failed and the running result is not ok
24160 1726853535.68645: done checking to see if all hosts have failed
24160 1726853535.68646: getting the remaining hosts for this loop
24160 1726853535.68648: done getting the remaining hosts for this loop
24160 1726853535.68652: getting the next task for host managed_node1
24160 1726853535.68659: done getting next task for host managed_node1
24160 1726853535.68663: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
24160 1726853535.68666: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853535.68682: getting variables
24160 1726853535.68683: in VariableManager get_vars()
24160 1726853535.68722: Calling all_inventory to load vars for managed_node1
24160 1726853535.68724: Calling groups_inventory to load vars for managed_node1
24160 1726853535.68727: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853535.68737: Calling all_plugins_play to load vars for managed_node1
24160 1726853535.68740: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853535.68742: Calling groups_plugins_play to load vars for managed_node1
24160 1726853535.69484: done sending task result for task 02083763-bbaf-5676-4eb4-00000000001d
24160 1726853535.69488: WORKER PROCESS EXITING
24160 1726853535.70250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853535.71601: done with get_vars()
24160 1726853535.71618: done getting variables
24160 1726853535.71663: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:32:15 -0400 (0:00:00.075) 0:00:12.119 ******

24160 1726853535.71690: entering _queue_task() for managed_node1/fail
24160 1726853535.71925: worker is 1 (out of 1 available)
24160 1726853535.71940: exiting _queue_task() for managed_node1/fail
24160 1726853535.71952: done queuing things up, now waiting for results queue to drain
24160 1726853535.71954: waiting for pending results...
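The skip recorded above is the standard when-evaluation path: the conditional renders to False, so the executor emits a skip result instead of running the module. A minimal plain-Python sketch of that decision (not Ansible's real TaskExecutor code; the fact value "9" is a hypothetical example for a distribution-9 node):

```python
# Hedged sketch of the skip decision shown in the log; not Ansible internals.
facts = {"ansible_distribution_major_version": "9"}  # hypothetical fact value

def evaluate_when(expr_text, value):
    """Mimic the logged behaviour: a False conditional short-circuits
    the task and yields a skip result dict instead of running it."""
    if value:
        return None  # conditional True: the task would actually run
    return {
        "changed": False,
        "false_condition": expr_text,
        "skip_reason": "Conditional result was False",
    }

result = evaluate_when(
    "ansible_distribution_major_version | int < 8",
    int(facts["ansible_distribution_major_version"]) < 8,
)
print(result["skip_reason"])
```

The `false_condition` and `skip_reason` keys match the fields the callback prints in the `skipping:` block above.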
24160 1726853535.72136: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
24160 1726853535.72222: in run() - task 02083763-bbaf-5676-4eb4-00000000001e
24160 1726853535.72234: variable 'ansible_search_path' from source: unknown
24160 1726853535.72238: variable 'ansible_search_path' from source: unknown
24160 1726853535.72284: calling self._execute()
24160 1726853535.72346: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853535.72350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853535.72362: variable 'omit' from source: magic vars
24160 1726853535.72632: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.72643: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853535.72727: variable '__network_wireless_connections_defined' from source: role '' defaults
24160 1726853535.72852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24160 1726853535.74697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24160 1726853535.74746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24160 1726853535.74776: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24160 1726853535.74804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24160 1726853535.74823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24160 1726853535.74883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.74908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.74925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.74950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.74964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.74998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.75017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.75034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.75061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.75073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.75102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.75122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.75136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.75162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.75173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.75285: variable 'network_connections' from source: task vars
24160 1726853535.75295: variable 'interface' from source: set_fact
24160 1726853535.75347: variable 'interface' from source: set_fact
24160 1726853535.75355: variable 'interface' from source: set_fact
24160 1726853535.75400: variable 'interface' from source: set_fact
24160 1726853535.75447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24160 1726853535.75560: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24160 1726853535.75588: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24160 1726853535.75620: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24160 1726853535.75641: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24160 1726853535.75675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24160 1726853535.75691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24160 1726853535.75708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.75726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24160 1726853535.75770: variable '__network_team_connections_defined' from source: role '' defaults
24160 1726853535.75919: variable 'network_connections' from source: task vars
24160 1726853535.75923: variable 'interface' from source: set_fact
24160 1726853535.75964: variable 'interface' from source: set_fact
24160 1726853535.75968: variable 'interface' from source: set_fact
24160 1726853535.76013: variable 'interface' from source: set_fact
24160 1726853535.76035: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
24160 1726853535.76039: when evaluation is False, skipping this task
24160 1726853535.76041: _execute() done
24160 1726853535.76043: dumping result to json
24160 1726853535.76046: done dumping result, returning
24160 1726853535.76056: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000001e]
24160 1726853535.76066: sending task result for task 02083763-bbaf-5676-4eb4-00000000001e
24160 1726853535.76151: done sending task result for task 02083763-bbaf-5676-4eb4-00000000001e
24160 1726853535.76156: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
24160 1726853535.76215: no more pending results, returning what we have
24160 1726853535.76219: results queue empty
24160 1726853535.76220: checking for any_errors_fatal
24160 1726853535.76225: done checking for any_errors_fatal
24160 1726853535.76226: checking for max_fail_percentage
24160 1726853535.76227: done checking for max_fail_percentage
24160 1726853535.76228: checking to see if all hosts have failed and the running result is not ok
24160 1726853535.76229: done checking to see if all hosts have failed
24160 1726853535.76229: getting the remaining hosts for this loop
24160 1726853535.76232: done getting the remaining hosts for this loop
24160 1726853535.76235: getting the next task for host managed_node1
24160 1726853535.76242: done getting next task for host managed_node1
24160 1726853535.76245: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
24160 1726853535.76248: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853535.76263: getting variables
24160 1726853535.76266: in VariableManager get_vars()
24160 1726853535.76336: Calling all_inventory to load vars for managed_node1
24160 1726853535.76339: Calling groups_inventory to load vars for managed_node1
24160 1726853535.76341: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853535.76349: Calling all_plugins_play to load vars for managed_node1
24160 1726853535.76352: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853535.76357: Calling groups_plugins_play to load vars for managed_node1
24160 1726853535.77862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853535.84065: done with get_vars()
24160 1726853535.84090: done getting variables
24160 1726853535.84139: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:32:15 -0400 (0:00:00.124) 0:00:12.244 ******

24160 1726853535.84178: entering _queue_task() for managed_node1/package
24160 1726853535.84676: worker is 1 (out of 1 available)
24160 1726853535.84688: exiting _queue_task() for managed_node1/package
24160 1726853535.84700: done queuing things up, now waiting for results queue to drain
24160 1726853535.84701: waiting for pending results...
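The consent task above skipped because neither wireless nor team connections are defined. A hedged Python sketch of that or-conditional (the connection list is illustrative, not the play's actual `network_connections`; the role's real defaults derive these booleans from Jinja expressions, which this only approximates):

```python
# Illustrative connection list; the real play's network_connections differs.
network_connections = [{"name": "eth0", "type": "ethernet"}]

# Roughly mirrors __network_wireless_connections_defined and
# __network_team_connections_defined: scan the connections for the
# relevant types (an approximation of the role's Jinja expressions).
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# The task's when-clause: either kind of connection triggers the prompt.
run_task = wireless_defined or team_defined
print(run_task)
```

With only an ethernet connection, both booleans are False, so the or-conditional is False and the task is skipped, matching the `skipping:` result in the log.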
24160 1726853535.85518: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
24160 1726853535.85523: in run() - task 02083763-bbaf-5676-4eb4-00000000001f
24160 1726853535.85590: variable 'ansible_search_path' from source: unknown
24160 1726853535.85598: variable 'ansible_search_path' from source: unknown
24160 1726853535.85643: calling self._execute()
24160 1726853535.85979: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853535.85982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853535.86086: variable 'omit' from source: magic vars
24160 1726853535.86667: variable 'ansible_distribution_major_version' from source: facts
24160 1726853535.86752: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853535.87058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24160 1726853535.87669: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24160 1726853535.87929: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24160 1726853535.87988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24160 1726853535.88028: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24160 1726853535.88478: variable 'network_packages' from source: role '' defaults
24160 1726853535.88597: variable '__network_provider_setup' from source: role '' defaults
24160 1726853535.88613: variable '__network_service_name_default_nm' from source: role '' defaults
24160 1726853535.88687: variable '__network_service_name_default_nm' from source: role '' defaults
24160 1726853535.88804: variable '__network_packages_default_nm' from source: role '' defaults
24160 1726853535.88857: variable '__network_packages_default_nm' from source: role '' defaults
24160 1726853535.89282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24160 1726853535.93417: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24160 1726853535.93704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24160 1726853535.93708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24160 1726853535.93710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24160 1726853535.93798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24160 1726853535.94138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.94142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.94144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.94146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.94182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.94229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.94383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.94412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.94460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.94676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.95011: variable '__network_packages_default_gobject_packages' from source: role '' defaults
24160 1726853535.95276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.95306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.95475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.95500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.95518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.95611: variable 'ansible_python' from source: facts
24160 1726853535.95877: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
24160 1726853535.95880: variable '__network_wpa_supplicant_required' from source: role '' defaults
24160 1726853535.96050: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
24160 1726853535.96298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.96402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.96436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.96533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.96555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.96638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853535.96858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853535.96861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.96863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853535.96865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853535.97213: variable 'network_connections' from source: task vars
24160 1726853535.97225: variable 'interface' from source: set_fact
24160 1726853535.97335: variable 'interface' from source: set_fact
24160 1726853535.97349: variable 'interface' from source: set_fact
24160 1726853535.97458: variable 'interface' from source: set_fact
24160 1726853535.97537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24160 1726853535.97575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24160 1726853535.97618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853535.97651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24160 1726853535.97707: variable '__network_wireless_connections_defined' from source: role '' defaults
24160 1726853535.98011: variable 'network_connections' from source: task vars
24160 1726853535.98021: variable 'interface' from source: set_fact
24160 1726853535.98125: variable 'interface' from source: set_fact
24160 1726853535.98140: variable 'interface' from source: set_fact
24160 1726853535.98244: variable 'interface' from source: set_fact
24160 1726853535.98309: variable '__network_packages_default_wireless' from source: role '' defaults
24160 1726853535.98401: variable '__network_wireless_connections_defined' from source: role '' defaults
24160 1726853535.98715: variable 'network_connections' from source: task vars
24160 1726853535.98726: variable 'interface' from source: set_fact
24160 1726853535.98806: variable 'interface' from source: set_fact
24160 1726853535.98809: variable 'interface' from source: set_fact
24160 1726853535.98876: variable 'interface' from source: set_fact
24160 1726853535.98915: variable '__network_packages_default_team' from source: role '' defaults
24160 1726853535.99024: variable '__network_team_connections_defined' from source: role '' defaults
24160 1726853535.99309: variable 'network_connections' from source: task vars
24160 1726853535.99318: variable 'interface' from source: set_fact
24160 1726853535.99389: variable 'interface' from source: set_fact
24160 1726853535.99401: variable 'interface' from source: set_fact
24160 1726853535.99473: variable 'interface' from source: set_fact
24160 1726853535.99569: variable '__network_service_name_default_initscripts' from source: role '' defaults
24160 1726853535.99607: variable '__network_service_name_default_initscripts' from source: role '' defaults
24160 1726853535.99620: variable '__network_packages_default_initscripts' from source: role '' defaults
24160 1726853535.99689: variable '__network_packages_default_initscripts' from source: role '' defaults
24160 1726853535.99930: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
24160 1726853536.00429: variable 'network_connections' from source: task vars
24160 1726853536.00549: variable 'interface' from source: set_fact
24160 1726853536.00552: variable 'interface' from source: set_fact
24160 1726853536.00556: variable 'interface' from source: set_fact
24160 1726853536.00580: variable 'interface' from source: set_fact
24160 1726853536.00593: variable 'ansible_distribution' from source: facts
24160 1726853536.00601: variable '__network_rh_distros' from source: role '' defaults
24160 1726853536.00610: variable 'ansible_distribution_major_version' from source: facts
24160 1726853536.00636: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
24160 1726853536.00808: variable 'ansible_distribution' from source: facts
24160 1726853536.00817: variable '__network_rh_distros' from source: role '' defaults
24160 1726853536.00826: variable 'ansible_distribution_major_version' from source: facts
24160 1726853536.00841: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
24160 1726853536.01010: variable 'ansible_distribution' from source: facts
24160 1726853536.01019: variable '__network_rh_distros' from source: role '' defaults
24160 1726853536.01028: variable 'ansible_distribution_major_version' from source: facts
24160 1726853536.01068: variable 'network_provider' from source: set_fact
24160 1726853536.01093: variable 'ansible_facts' from source: unknown
24160 1726853536.02099: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
24160 1726853536.02102: when evaluation is False, skipping this task
24160 1726853536.02105: _execute() done
24160 1726853536.02107: dumping result to json
24160 1726853536.02109: done dumping result, returning
24160 1726853536.02112: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5676-4eb4-00000000001f]
24160 1726853536.02114: sending task result for task 02083763-bbaf-5676-4eb4-00000000001f
24160 1726853536.02188: done sending task result for task 02083763-bbaf-5676-4eb4-00000000001f
24160 1726853536.02191: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
24160 1726853536.02252: no more pending results, returning what we have
24160 1726853536.02259: results queue empty
24160 1726853536.02260: checking for any_errors_fatal
24160 1726853536.02270: done checking for any_errors_fatal
24160 1726853536.02274: checking for max_fail_percentage
24160 1726853536.02276: done checking for max_fail_percentage
24160 1726853536.02277: checking to see if all hosts have failed and the running result is not ok
24160 1726853536.02278: done checking to see if all hosts have failed
24160 1726853536.02278: getting the remaining hosts for this loop
24160 1726853536.02280: done getting the remaining hosts for this loop
24160 1726853536.02285: getting the next task for host managed_node1
24160 1726853536.02293: done getting next task for host managed_node1
24160 1726853536.02297: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
24160 1726853536.02300: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853536.02319: getting variables
24160 1726853536.02321: in VariableManager get_vars()
24160 1726853536.02363: Calling all_inventory to load vars for managed_node1
24160 1726853536.02366: Calling groups_inventory to load vars for managed_node1
24160 1726853536.02369: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853536.02383: Calling all_plugins_play to load vars for managed_node1
24160 1726853536.02386: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853536.02389: Calling groups_plugins_play to load vars for managed_node1
24160 1726853536.04008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853536.06205: done with get_vars()
24160 1726853536.06228: done getting variables
24160 1726853536.06291: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 13:32:16 -0400 (0:00:00.221) 0:00:12.465 ******

24160 1726853536.06324: entering _queue_task() for managed_node1/package
24160 1726853536.06879: worker is 1 (out of 1 available)
24160 1726853536.06890: exiting _queue_task() for managed_node1/package
24160 1726853536.06902: done queuing things up, now waiting for results queue to drain
24160 1726853536.06903: waiting for pending results...
24160 1726853536.07261: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
24160 1726853536.07416: in run() - task 02083763-bbaf-5676-4eb4-000000000020
24160 1726853536.07438: variable 'ansible_search_path' from source: unknown
24160 1726853536.07447: variable 'ansible_search_path' from source: unknown
24160 1726853536.07498: calling self._execute()
24160 1726853536.07604: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853536.07616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853536.07629: variable 'omit' from source: magic vars
24160 1726853536.08579: variable 'ansible_distribution_major_version' from source: facts
24160 1726853536.08582: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853536.08585: variable 'network_state' from source: role '' defaults
24160 1726853536.08696: Evaluated conditional (network_state != {}): False
24160 1726853536.08704: when evaluation is False, skipping this task
24160 1726853536.08711: _execute() done
24160 1726853536.08717: dumping result to json
24160 1726853536.08723: done dumping result, returning
24160 1726853536.08805: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5676-4eb4-000000000020]
24160 1726853536.08816: sending task result for task 02083763-bbaf-5676-4eb4-000000000020
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
24160 1726853536.08989: no more pending results, returning what we have
24160 1726853536.08993: results queue
empty 24160 1726853536.08995: checking for any_errors_fatal 24160 1726853536.09002: done checking for any_errors_fatal 24160 1726853536.09002: checking for max_fail_percentage 24160 1726853536.09005: done checking for max_fail_percentage 24160 1726853536.09006: checking to see if all hosts have failed and the running result is not ok 24160 1726853536.09006: done checking to see if all hosts have failed 24160 1726853536.09007: getting the remaining hosts for this loop 24160 1726853536.09009: done getting the remaining hosts for this loop 24160 1726853536.09013: getting the next task for host managed_node1 24160 1726853536.09021: done getting next task for host managed_node1 24160 1726853536.09025: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24160 1726853536.09029: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853536.09050: getting variables 24160 1726853536.09052: in VariableManager get_vars() 24160 1726853536.09096: Calling all_inventory to load vars for managed_node1 24160 1726853536.09099: Calling groups_inventory to load vars for managed_node1 24160 1726853536.09102: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853536.09113: Calling all_plugins_play to load vars for managed_node1 24160 1726853536.09116: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853536.09118: Calling groups_plugins_play to load vars for managed_node1 24160 1726853536.09853: done sending task result for task 02083763-bbaf-5676-4eb4-000000000020 24160 1726853536.09858: WORKER PROCESS EXITING 24160 1726853536.11137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853536.13339: done with get_vars() 24160 1726853536.13369: done getting variables 24160 1726853536.13432: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:32:16 -0400 (0:00:00.071) 0:00:12.537 ****** 24160 1726853536.13470: entering _queue_task() for managed_node1/package 24160 1726853536.13879: worker is 1 (out of 1 available) 24160 1726853536.13893: exiting _queue_task() for managed_node1/package 24160 1726853536.13907: done queuing things up, now waiting for results queue to drain 24160 1726853536.13909: waiting for pending results... 
24160 1726853536.14167: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24160 1726853536.14477: in run() - task 02083763-bbaf-5676-4eb4-000000000021 24160 1726853536.14481: variable 'ansible_search_path' from source: unknown 24160 1726853536.14483: variable 'ansible_search_path' from source: unknown 24160 1726853536.14486: calling self._execute() 24160 1726853536.14489: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853536.14492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853536.14495: variable 'omit' from source: magic vars 24160 1726853536.14843: variable 'ansible_distribution_major_version' from source: facts 24160 1726853536.14857: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853536.14968: variable 'network_state' from source: role '' defaults 24160 1726853536.14981: Evaluated conditional (network_state != {}): False 24160 1726853536.14988: when evaluation is False, skipping this task 24160 1726853536.14991: _execute() done 24160 1726853536.14994: dumping result to json 24160 1726853536.14998: done dumping result, returning 24160 1726853536.15007: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5676-4eb4-000000000021] 24160 1726853536.15012: sending task result for task 02083763-bbaf-5676-4eb4-000000000021 24160 1726853536.15116: done sending task result for task 02083763-bbaf-5676-4eb4-000000000021 24160 1726853536.15118: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853536.15199: no more pending results, returning what we have 24160 1726853536.15203: results queue empty 24160 1726853536.15204: checking for 
any_errors_fatal 24160 1726853536.15212: done checking for any_errors_fatal 24160 1726853536.15213: checking for max_fail_percentage 24160 1726853536.15214: done checking for max_fail_percentage 24160 1726853536.15215: checking to see if all hosts have failed and the running result is not ok 24160 1726853536.15216: done checking to see if all hosts have failed 24160 1726853536.15216: getting the remaining hosts for this loop 24160 1726853536.15218: done getting the remaining hosts for this loop 24160 1726853536.15221: getting the next task for host managed_node1 24160 1726853536.15228: done getting next task for host managed_node1 24160 1726853536.15231: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24160 1726853536.15234: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853536.15248: getting variables 24160 1726853536.15249: in VariableManager get_vars() 24160 1726853536.15292: Calling all_inventory to load vars for managed_node1 24160 1726853536.15295: Calling groups_inventory to load vars for managed_node1 24160 1726853536.15297: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853536.15305: Calling all_plugins_play to load vars for managed_node1 24160 1726853536.15308: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853536.15310: Calling groups_plugins_play to load vars for managed_node1 24160 1726853536.16568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853536.18131: done with get_vars() 24160 1726853536.18159: done getting variables 24160 1726853536.18261: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:32:16 -0400 (0:00:00.048) 0:00:12.585 ****** 24160 1726853536.18293: entering _queue_task() for managed_node1/service 24160 1726853536.18295: Creating lock for service 24160 1726853536.18604: worker is 1 (out of 1 available) 24160 1726853536.18617: exiting _queue_task() for managed_node1/service 24160 1726853536.18628: done queuing things up, now waiting for results queue to drain 24160 1726853536.18630: waiting for pending results... 
24160 1726853536.19089: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24160 1726853536.19094: in run() - task 02083763-bbaf-5676-4eb4-000000000022 24160 1726853536.19097: variable 'ansible_search_path' from source: unknown 24160 1726853536.19100: variable 'ansible_search_path' from source: unknown 24160 1726853536.19118: calling self._execute() 24160 1726853536.19227: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853536.19239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853536.19256: variable 'omit' from source: magic vars 24160 1726853536.19649: variable 'ansible_distribution_major_version' from source: facts 24160 1726853536.19669: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853536.19793: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853536.19990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853536.22529: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853536.22592: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853536.22650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853536.22686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853536.22710: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853536.22793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 24160 1726853536.22821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.22846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.22895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.22910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.22957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853536.22988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.23076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.23079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.23082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.23113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853536.23136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.23160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.23204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.23219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.23395: variable 'network_connections' from source: task vars 24160 1726853536.23412: variable 'interface' from source: set_fact 24160 1726853536.23487: variable 'interface' from source: set_fact 24160 1726853536.23676: variable 'interface' from source: set_fact 24160 1726853536.23679: variable 'interface' from source: set_fact 24160 1726853536.23682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853536.23799: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853536.23842: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853536.23873: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853536.23911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853536.23959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853536.23983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853536.24008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.24033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853536.24097: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853536.24339: variable 'network_connections' from source: task vars 24160 1726853536.24345: variable 'interface' from source: set_fact 24160 1726853536.24412: variable 'interface' from source: set_fact 24160 1726853536.24418: variable 'interface' from source: set_fact 24160 1726853536.24479: variable 'interface' from source: set_fact 24160 1726853536.24515: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853536.24519: when evaluation is False, skipping this task 24160 1726853536.24521: _execute() done 24160 1726853536.24524: dumping result to json 24160 1726853536.24526: done dumping result, returning 24160 1726853536.24535: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-000000000022] 24160 1726853536.24545: sending task result for task 02083763-bbaf-5676-4eb4-000000000022 24160 1726853536.24629: done sending task result for task 02083763-bbaf-5676-4eb4-000000000022 24160 1726853536.24634: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24160 1726853536.24686: no more pending results, returning what we have 24160 1726853536.24690: results queue empty 24160 1726853536.24691: checking for any_errors_fatal 24160 1726853536.24701: done checking for any_errors_fatal 24160 1726853536.24702: checking for max_fail_percentage 24160 1726853536.24704: done checking for max_fail_percentage 24160 1726853536.24704: checking to see if all hosts have failed and the running result is not ok 24160 1726853536.24705: done checking to see if all hosts have failed 24160 1726853536.24706: getting the remaining hosts for this loop 24160 1726853536.24708: done getting the remaining hosts for this loop 24160 1726853536.24711: getting the next task for host managed_node1 24160 1726853536.24718: done getting next task for host managed_node1 24160 1726853536.24722: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24160 1726853536.24725: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 24160 1726853536.24740: getting variables 24160 1726853536.24741: in VariableManager get_vars() 24160 1726853536.24782: Calling all_inventory to load vars for managed_node1 24160 1726853536.24785: Calling groups_inventory to load vars for managed_node1 24160 1726853536.24788: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853536.24798: Calling all_plugins_play to load vars for managed_node1 24160 1726853536.24801: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853536.24804: Calling groups_plugins_play to load vars for managed_node1 24160 1726853536.26607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853536.28228: done with get_vars() 24160 1726853536.28255: done getting variables 24160 1726853536.28316: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:32:16 -0400 (0:00:00.100) 0:00:12.686 ****** 24160 1726853536.28355: entering _queue_task() for managed_node1/service 24160 1726853536.28787: worker is 1 (out of 1 available) 24160 1726853536.28796: exiting _queue_task() for managed_node1/service 24160 1726853536.28807: done queuing things up, now waiting for results queue to drain 24160 1726853536.28809: waiting for pending results... 
24160 1726853536.29189: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24160 1726853536.29195: in run() - task 02083763-bbaf-5676-4eb4-000000000023 24160 1726853536.29198: variable 'ansible_search_path' from source: unknown 24160 1726853536.29201: variable 'ansible_search_path' from source: unknown 24160 1726853536.29204: calling self._execute() 24160 1726853536.29273: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853536.29278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853536.29288: variable 'omit' from source: magic vars 24160 1726853536.29670: variable 'ansible_distribution_major_version' from source: facts 24160 1726853536.29688: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853536.29856: variable 'network_provider' from source: set_fact 24160 1726853536.29860: variable 'network_state' from source: role '' defaults 24160 1726853536.29868: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24160 1726853536.29877: variable 'omit' from source: magic vars 24160 1726853536.29932: variable 'omit' from source: magic vars 24160 1726853536.29962: variable 'network_service_name' from source: role '' defaults 24160 1726853536.30036: variable 'network_service_name' from source: role '' defaults 24160 1726853536.30143: variable '__network_provider_setup' from source: role '' defaults 24160 1726853536.30150: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853536.30212: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853536.30220: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853536.30288: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853536.30516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 24160 1726853536.32654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853536.32707: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853536.32737: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853536.32763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853536.32785: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853536.32848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853536.32877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.32896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.32922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.32932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.32968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24160 1726853536.32986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.33003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.33026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.33037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.33185: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24160 1726853536.33274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853536.33291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.33308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.33332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.33342: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.33404: variable 'ansible_python' from source: facts 24160 1726853536.33425: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24160 1726853536.33483: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853536.33536: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853536.33619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853536.33637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.33653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.33681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.33691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.33724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853536.33744: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853536.33762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.33788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853536.33798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853536.33891: variable 'network_connections' from source: task vars 24160 1726853536.33897: variable 'interface' from source: set_fact 24160 1726853536.33949: variable 'interface' from source: set_fact 24160 1726853536.33961: variable 'interface' from source: set_fact 24160 1726853536.34011: variable 'interface' from source: set_fact 24160 1726853536.34084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853536.34228: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853536.34266: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853536.34297: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853536.34327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853536.34374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853536.34394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853536.34576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853536.34579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853536.34582: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853536.34730: variable 'network_connections' from source: task vars 24160 1726853536.34735: variable 'interface' from source: set_fact 24160 1726853536.34790: variable 'interface' from source: set_fact 24160 1726853536.34798: variable 'interface' from source: set_fact 24160 1726853536.34849: variable 'interface' from source: set_fact 24160 1726853536.34885: variable '__network_packages_default_wireless' from source: role '' defaults 24160 1726853536.34942: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853536.35125: variable 'network_connections' from source: task vars 24160 1726853536.35130: variable 'interface' from source: set_fact 24160 1726853536.35181: variable 'interface' from source: set_fact 24160 1726853536.35187: variable 'interface' from source: set_fact 24160 1726853536.35233: variable 'interface' from source: set_fact 24160 1726853536.35252: variable '__network_packages_default_team' from source: role '' defaults 24160 1726853536.35307: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853536.35546: variable 
'network_connections' from source: task vars 24160 1726853536.35549: variable 'interface' from source: set_fact 24160 1726853536.35613: variable 'interface' from source: set_fact 24160 1726853536.35618: variable 'interface' from source: set_fact 24160 1726853536.35682: variable 'interface' from source: set_fact 24160 1726853536.35731: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853536.35793: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853536.35800: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853536.35872: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853536.36078: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24160 1726853536.36386: variable 'network_connections' from source: task vars 24160 1726853536.36389: variable 'interface' from source: set_fact 24160 1726853536.36432: variable 'interface' from source: set_fact 24160 1726853536.36438: variable 'interface' from source: set_fact 24160 1726853536.36481: variable 'interface' from source: set_fact 24160 1726853536.36488: variable 'ansible_distribution' from source: facts 24160 1726853536.36491: variable '__network_rh_distros' from source: role '' defaults 24160 1726853536.36497: variable 'ansible_distribution_major_version' from source: facts 24160 1726853536.36513: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24160 1726853536.36625: variable 'ansible_distribution' from source: facts 24160 1726853536.36628: variable '__network_rh_distros' from source: role '' defaults 24160 1726853536.36632: variable 'ansible_distribution_major_version' from source: facts 24160 1726853536.36644: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24160 1726853536.36750: variable 'ansible_distribution' from source: 
facts 24160 1726853536.36753: variable '__network_rh_distros' from source: role '' defaults 24160 1726853536.36761: variable 'ansible_distribution_major_version' from source: facts 24160 1726853536.36787: variable 'network_provider' from source: set_fact 24160 1726853536.36804: variable 'omit' from source: magic vars 24160 1726853536.36823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853536.36843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853536.36860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853536.36874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853536.36883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853536.36905: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853536.36908: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853536.36911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853536.36979: Set connection var ansible_shell_executable to /bin/sh 24160 1726853536.36984: Set connection var ansible_pipelining to False 24160 1726853536.36987: Set connection var ansible_connection to ssh 24160 1726853536.36989: Set connection var ansible_shell_type to sh 24160 1726853536.36996: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853536.37003: Set connection var ansible_timeout to 10 24160 1726853536.37020: variable 'ansible_shell_executable' from source: unknown 24160 1726853536.37023: variable 'ansible_connection' from source: unknown 24160 1726853536.37025: variable 'ansible_module_compression' from source: unknown 24160 1726853536.37028: 
variable 'ansible_shell_type' from source: unknown 24160 1726853536.37031: variable 'ansible_shell_executable' from source: unknown 24160 1726853536.37033: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853536.37039: variable 'ansible_pipelining' from source: unknown 24160 1726853536.37041: variable 'ansible_timeout' from source: unknown 24160 1726853536.37043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853536.37114: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853536.37121: variable 'omit' from source: magic vars 24160 1726853536.37127: starting attempt loop 24160 1726853536.37130: running the handler 24160 1726853536.37186: variable 'ansible_facts' from source: unknown 24160 1726853536.37735: _low_level_execute_command(): starting 24160 1726853536.37741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853536.38239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853536.38243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853536.38246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853536.38248: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853536.38250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853536.38301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853536.38306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853536.38308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853536.38360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853536.40042: stdout chunk (state=3): >>>/root <<< 24160 1726853536.40140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853536.40170: stderr chunk (state=3): >>><<< 24160 1726853536.40176: stdout chunk (state=3): >>><<< 24160 1726853536.40194: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853536.40205: _low_level_execute_command(): starting 24160 1726853536.40210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292 `" && echo ansible-tmp-1726853536.401933-24803-133543306382292="` echo /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292 `" ) && sleep 0' 24160 1726853536.40639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853536.40670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853536.40676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853536.40679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853536.40682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853536.40685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853536.40687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853536.40736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853536.40743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853536.40745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853536.40784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853536.42651: stdout chunk (state=3): >>>ansible-tmp-1726853536.401933-24803-133543306382292=/root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292 <<< 24160 1726853536.42803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853536.42805: stdout chunk (state=3): >>><<< 24160 1726853536.42807: stderr chunk (state=3): >>><<< 24160 1726853536.42818: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853536.401933-24803-133543306382292=/root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853536.42976: variable 'ansible_module_compression' from source: unknown 24160 1726853536.42983: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 24160 1726853536.42986: ANSIBALLZ: Acquiring lock 24160 1726853536.42988: ANSIBALLZ: Lock acquired: 140302803944608 24160 1726853536.42990: ANSIBALLZ: Creating module 24160 1726853536.75579: ANSIBALLZ: Writing module into payload 24160 1726853536.75721: ANSIBALLZ: Writing module 24160 1726853536.75725: ANSIBALLZ: Renaming module 24160 1726853536.75727: ANSIBALLZ: Done creating module 24160 1726853536.75759: variable 'ansible_facts' from source: unknown 24160 1726853536.75978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py 24160 1726853536.76094: Sending initial data 24160 1726853536.76098: Sent initial data (155 bytes) 24160 1726853536.76785: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853536.76799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853536.76810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853536.76994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853536.77144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853536.79107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853536.79145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853536.79189: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpoh7uxrsr /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py <<< 24160 1726853536.79193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py" <<< 24160 1726853536.79230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpoh7uxrsr" to remote "/root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py" <<< 24160 1726853536.81812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853536.81816: stdout chunk (state=3): >>><<< 24160 1726853536.81822: stderr chunk (state=3): >>><<< 24160 1726853536.81882: done transferring module to remote 24160 1726853536.81893: _low_level_execute_command(): starting 24160 1726853536.81898: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/ /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py && sleep 0' 24160 1726853536.83217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853536.83330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853536.83364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853536.83555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853536.85284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853536.85327: stderr chunk (state=3): >>><<< 24160 1726853536.85487: stdout chunk (state=3): >>><<< 24160 1726853536.85502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853536.85506: _low_level_execute_command(): starting 24160 1726853536.85570: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/AnsiballZ_systemd.py && sleep 0' 24160 1726853536.86568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853536.86637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853536.86788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853536.86809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853536.86823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853536.86831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853536.86903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 24160 1726853537.15783: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10678272", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3328217088", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1079983000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 24160 1726853537.15793: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", 
"SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", 
"Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-<<< 24160 1726853537.15839: stdout chunk (state=3): >>>init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24160 1726853537.17618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853537.17630: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 24160 1726853537.17704: stderr chunk (state=3): >>><<< 24160 1726853537.17713: stdout chunk (state=3): >>><<< 24160 1726853537.17738: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 
EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10678272", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3328217088", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1079983000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853537.18274: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853537.18277: _low_level_execute_command(): starting 24160 1726853537.18280: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853536.401933-24803-133543306382292/ > /dev/null 2>&1 && sleep 0' 24160 1726853537.19592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853537.19639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853537.19661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853537.19723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853537.21613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853537.21616: stdout chunk (state=3): >>><<< 24160 1726853537.21619: stderr chunk (state=3): >>><<< 24160 1726853537.21777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853537.21780: handler run complete 24160 1726853537.21883: attempt loop complete, returning result 24160 1726853537.21887: _execute() done 24160 1726853537.21891: dumping result to json 24160 1726853537.21893: done dumping result, returning 24160 1726853537.21896: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5676-4eb4-000000000023] 24160 1726853537.21898: sending task result for task 02083763-bbaf-5676-4eb4-000000000023 24160 1726853537.23089: done sending task result for task 02083763-bbaf-5676-4eb4-000000000023 24160 1726853537.23092: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853537.23143: no more pending results, returning what we have 24160 1726853537.23146: results queue empty 24160 1726853537.23148: checking for any_errors_fatal 24160 1726853537.23154: done checking for any_errors_fatal 24160 1726853537.23155: checking for max_fail_percentage 24160 1726853537.23157: done checking for max_fail_percentage 24160 1726853537.23158: checking to see if all hosts have failed and the running result is not ok 24160 1726853537.23158: done checking to see if all hosts have failed 24160 1726853537.23159: getting the remaining hosts for this loop 24160 1726853537.23160: done getting the remaining hosts for 
this loop 24160 1726853537.23164: getting the next task for host managed_node1 24160 1726853537.23195: done getting next task for host managed_node1 24160 1726853537.23199: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24160 1726853537.23203: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853537.23214: getting variables 24160 1726853537.23216: in VariableManager get_vars() 24160 1726853537.23253: Calling all_inventory to load vars for managed_node1 24160 1726853537.23256: Calling groups_inventory to load vars for managed_node1 24160 1726853537.23258: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853537.23270: Calling all_plugins_play to load vars for managed_node1 24160 1726853537.23636: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853537.23640: Calling groups_plugins_play to load vars for managed_node1 24160 1726853537.26939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853537.30345: done with get_vars() 24160 1726853537.30378: done getting variables 24160 1726853537.30438: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:32:17 -0400 (0:00:01.021) 0:00:13.707 ****** 24160 1726853537.30474: entering _queue_task() for managed_node1/service 24160 1726853537.31124: worker is 1 (out of 1 available) 24160 1726853537.31252: exiting _queue_task() for managed_node1/service 24160 1726853537.31264: done queuing things up, now waiting for results queue to drain 24160 1726853537.31266: waiting for pending results... 24160 1726853537.31677: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24160 1726853537.31986: in run() - task 02083763-bbaf-5676-4eb4-000000000024 24160 1726853537.32001: variable 'ansible_search_path' from source: unknown 24160 1726853537.32004: variable 'ansible_search_path' from source: unknown 24160 1726853537.32080: calling self._execute() 24160 1726853537.32135: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853537.32139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853537.32151: variable 'omit' from source: magic vars 24160 1726853537.32954: variable 'ansible_distribution_major_version' from source: facts 24160 1726853537.32972: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853537.33289: variable 'network_provider' from source: set_fact 24160 1726853537.33295: Evaluated conditional (network_provider == "nm"): True 24160 1726853537.33592: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853537.33679: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853537.34078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853537.38138: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853537.38410: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853537.38447: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853537.38493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853537.38531: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853537.38806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853537.38835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853537.38965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853537.38969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853537.38973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853537.38976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 
1726853537.39189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853537.39213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853537.39251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853537.39268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853537.39308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853537.39331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853537.39354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853537.39599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853537.39632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24160 1726853537.39750: variable 'network_connections' from source: task vars 24160 1726853537.39767: variable 'interface' from source: set_fact 24160 1726853537.40066: variable 'interface' from source: set_fact 24160 1726853537.40070: variable 'interface' from source: set_fact 24160 1726853537.40117: variable 'interface' from source: set_fact 24160 1726853537.40476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853537.40554: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853537.40796: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853537.40827: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853537.40854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853537.40899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853537.40919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853537.40943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853537.40973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853537.41222: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 
1726853537.41677: variable 'network_connections' from source: task vars 24160 1726853537.41876: variable 'interface' from source: set_fact 24160 1726853537.41879: variable 'interface' from source: set_fact 24160 1726853537.41881: variable 'interface' from source: set_fact 24160 1726853537.42004: variable 'interface' from source: set_fact 24160 1726853537.42043: Evaluated conditional (__network_wpa_supplicant_required): False 24160 1726853537.42046: when evaluation is False, skipping this task 24160 1726853537.42048: _execute() done 24160 1726853537.42058: dumping result to json 24160 1726853537.42060: done dumping result, returning 24160 1726853537.42067: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5676-4eb4-000000000024] 24160 1726853537.42073: sending task result for task 02083763-bbaf-5676-4eb4-000000000024 24160 1726853537.42165: done sending task result for task 02083763-bbaf-5676-4eb4-000000000024 24160 1726853537.42167: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24160 1726853537.42253: no more pending results, returning what we have 24160 1726853537.42256: results queue empty 24160 1726853537.42257: checking for any_errors_fatal 24160 1726853537.42284: done checking for any_errors_fatal 24160 1726853537.42285: checking for max_fail_percentage 24160 1726853537.42286: done checking for max_fail_percentage 24160 1726853537.42287: checking to see if all hosts have failed and the running result is not ok 24160 1726853537.42288: done checking to see if all hosts have failed 24160 1726853537.42289: getting the remaining hosts for this loop 24160 1726853537.42290: done getting the remaining hosts for this loop 24160 1726853537.42294: getting the next task for host managed_node1 24160 1726853537.42301: done getting next task for host managed_node1 
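The skip recorded above is driven by the role's `when:` guard: `__network_wpa_supplicant_required` evaluated False, so the task never ran on the host. A minimal sketch of an equivalently guarded task follows; the task wording mirrors the log, but the exact structure of the role's source is an assumption, not a quote from it:

```yaml
# Hypothetical sketch of a conditionally guarded service task.
# The role computes __network_wpa_supplicant_required from its
# defaults (e.g. wireless / 802.1x connections defined); when it
# is false, Ansible skips the task, as seen in the trace above.
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required | bool
```

When the conditional is False, the result is reported as `skipping:` with `"false_condition"` naming the failed guard, exactly as in the log record above.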
24160 1726853537.42305: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24160 1726853537.42308: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853537.42321: getting variables 24160 1726853537.42323: in VariableManager get_vars() 24160 1726853537.42360: Calling all_inventory to load vars for managed_node1 24160 1726853537.42363: Calling groups_inventory to load vars for managed_node1 24160 1726853537.42365: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853537.42381: Calling all_plugins_play to load vars for managed_node1 24160 1726853537.42384: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853537.42386: Calling groups_plugins_play to load vars for managed_node1 24160 1726853537.45139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853537.48364: done with get_vars() 24160 1726853537.48402: done getting variables 24160 1726853537.48474: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:32:17 -0400 (0:00:00.180) 0:00:13.887 ****** 24160 1726853537.48510: entering _queue_task() for managed_node1/service 24160 1726853537.48919: worker is 1 (out of 1 available) 24160 1726853537.48931: exiting _queue_task() for managed_node1/service 24160 1726853537.49060: done queuing things up, now waiting for results queue to drain 24160 1726853537.49062: waiting for pending results... 24160 1726853537.49302: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 24160 1726853537.49410: in run() - task 02083763-bbaf-5676-4eb4-000000000025 24160 1726853537.49431: variable 'ansible_search_path' from source: unknown 24160 1726853537.49474: variable 'ansible_search_path' from source: unknown 24160 1726853537.49545: calling self._execute() 24160 1726853537.49713: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853537.49719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853537.49732: variable 'omit' from source: magic vars 24160 1726853537.50644: variable 'ansible_distribution_major_version' from source: facts 24160 1726853537.50906: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853537.50919: variable 'network_provider' from source: set_fact 24160 1726853537.50930: Evaluated conditional (network_provider == "initscripts"): False 24160 1726853537.50939: when evaluation is False, skipping this task 24160 1726853537.50981: _execute() done 24160 1726853537.50990: dumping result to json 24160 1726853537.50998: done dumping result, returning 24160 1726853537.51016: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5676-4eb4-000000000025] 24160 1726853537.51027: sending task result for task 
02083763-bbaf-5676-4eb4-000000000025 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853537.51332: no more pending results, returning what we have 24160 1726853537.51336: results queue empty 24160 1726853537.51337: checking for any_errors_fatal 24160 1726853537.51346: done checking for any_errors_fatal 24160 1726853537.51347: checking for max_fail_percentage 24160 1726853537.51349: done checking for max_fail_percentage 24160 1726853537.51350: checking to see if all hosts have failed and the running result is not ok 24160 1726853537.51351: done checking to see if all hosts have failed 24160 1726853537.51352: getting the remaining hosts for this loop 24160 1726853537.51353: done getting the remaining hosts for this loop 24160 1726853537.51357: getting the next task for host managed_node1 24160 1726853537.51364: done getting next task for host managed_node1 24160 1726853537.51368: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24160 1726853537.51375: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853537.51392: getting variables 24160 1726853537.51393: in VariableManager get_vars() 24160 1726853537.51432: Calling all_inventory to load vars for managed_node1 24160 1726853537.51436: Calling groups_inventory to load vars for managed_node1 24160 1726853537.51438: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853537.51451: Calling all_plugins_play to load vars for managed_node1 24160 1726853537.51454: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853537.51457: Calling groups_plugins_play to load vars for managed_node1 24160 1726853537.52099: done sending task result for task 02083763-bbaf-5676-4eb4-000000000025 24160 1726853537.52103: WORKER PROCESS EXITING 24160 1726853537.54730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853537.58218: done with get_vars() 24160 1726853537.58249: done getting variables 24160 1726853537.58421: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:32:17 -0400 (0:00:00.099) 0:00:13.987 ****** 24160 1726853537.58459: entering _queue_task() for managed_node1/copy 24160 1726853537.59239: worker is 1 (out of 1 available) 24160 1726853537.59253: exiting _queue_task() for managed_node1/copy 24160 1726853537.59267: done queuing things up, now waiting for results queue to drain 24160 1726853537.59269: waiting for pending results... 
24160 1726853537.59943: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24160 1726853537.60061: in run() - task 02083763-bbaf-5676-4eb4-000000000026 24160 1726853537.60075: variable 'ansible_search_path' from source: unknown 24160 1726853537.60481: variable 'ansible_search_path' from source: unknown 24160 1726853537.60517: calling self._execute() 24160 1726853537.60606: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853537.60611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853537.60622: variable 'omit' from source: magic vars 24160 1726853537.61780: variable 'ansible_distribution_major_version' from source: facts 24160 1726853537.61790: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853537.62300: variable 'network_provider' from source: set_fact 24160 1726853537.62303: Evaluated conditional (network_provider == "initscripts"): False 24160 1726853537.62306: when evaluation is False, skipping this task 24160 1726853537.62309: _execute() done 24160 1726853537.62311: dumping result to json 24160 1726853537.62314: done dumping result, returning 24160 1726853537.62325: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5676-4eb4-000000000026] 24160 1726853537.62328: sending task result for task 02083763-bbaf-5676-4eb4-000000000026 24160 1726853537.62427: done sending task result for task 02083763-bbaf-5676-4eb4-000000000026 24160 1726853537.62430: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24160 1726853537.62489: no more pending results, returning what we have 24160 1726853537.62493: results queue empty 24160 1726853537.62495: checking for 
any_errors_fatal 24160 1726853537.62504: done checking for any_errors_fatal 24160 1726853537.62505: checking for max_fail_percentage 24160 1726853537.62507: done checking for max_fail_percentage 24160 1726853537.62508: checking to see if all hosts have failed and the running result is not ok 24160 1726853537.62509: done checking to see if all hosts have failed 24160 1726853537.62510: getting the remaining hosts for this loop 24160 1726853537.62512: done getting the remaining hosts for this loop 24160 1726853537.62516: getting the next task for host managed_node1 24160 1726853537.62525: done getting next task for host managed_node1 24160 1726853537.62529: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24160 1726853537.62532: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853537.62551: getting variables 24160 1726853537.62556: in VariableManager get_vars() 24160 1726853537.62598: Calling all_inventory to load vars for managed_node1 24160 1726853537.62602: Calling groups_inventory to load vars for managed_node1 24160 1726853537.62604: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853537.62616: Calling all_plugins_play to load vars for managed_node1 24160 1726853537.62619: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853537.62623: Calling groups_plugins_play to load vars for managed_node1 24160 1726853537.65622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853537.68214: done with get_vars() 24160 1726853537.68245: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:32:17 -0400 (0:00:00.098) 0:00:14.086 ****** 24160 1726853537.68339: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24160 1726853537.68341: Creating lock for fedora.linux_system_roles.network_connections 24160 1726853537.68761: worker is 1 (out of 1 available) 24160 1726853537.68776: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24160 1726853537.68846: done queuing things up, now waiting for results queue to drain 24160 1726853537.68849: waiting for pending results... 
24160 1726853537.69418: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24160 1726853537.69423: in run() - task 02083763-bbaf-5676-4eb4-000000000027 24160 1726853537.69427: variable 'ansible_search_path' from source: unknown 24160 1726853537.69429: variable 'ansible_search_path' from source: unknown 24160 1726853537.69433: calling self._execute() 24160 1726853537.69533: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853537.69545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853537.69559: variable 'omit' from source: magic vars 24160 1726853537.70485: variable 'ansible_distribution_major_version' from source: facts 24160 1726853537.70489: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853537.70492: variable 'omit' from source: magic vars 24160 1726853537.70494: variable 'omit' from source: magic vars 24160 1726853537.70919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853537.73676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853537.73747: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853537.73795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853537.73840: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853537.73880: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853537.73964: variable 'network_provider' from source: set_fact 24160 1726853537.74109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853537.74149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853537.74186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853537.74234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853537.74297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853537.74341: variable 'omit' from source: magic vars 24160 1726853537.74472: variable 'omit' from source: magic vars 24160 1726853537.74584: variable 'network_connections' from source: task vars 24160 1726853537.74599: variable 'interface' from source: set_fact 24160 1726853537.74672: variable 'interface' from source: set_fact 24160 1726853537.74732: variable 'interface' from source: set_fact 24160 1726853537.74757: variable 'interface' from source: set_fact 24160 1726853537.74939: variable 'omit' from source: magic vars 24160 1726853537.74957: variable '__lsr_ansible_managed' from source: task vars 24160 1726853537.75027: variable '__lsr_ansible_managed' from source: task vars 24160 1726853537.75306: Loaded config def from plugin (lookup/template) 24160 1726853537.75317: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24160 1726853537.75384: File lookup term: get_ansible_managed.j2 24160 
1726853537.75388: variable 'ansible_search_path' from source: unknown 24160 1726853537.75391: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24160 1726853537.75394: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24160 1726853537.75408: variable 'ansible_search_path' from source: unknown 24160 1726853537.83312: variable 'ansible_managed' from source: unknown 24160 1726853537.83634: variable 'omit' from source: magic vars 24160 1726853537.83637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853537.83639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853537.83755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853537.83779: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853537.83795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853537.83827: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853537.83872: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853537.84066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853537.84188: Set connection var ansible_shell_executable to /bin/sh 24160 1726853537.84197: Set connection var ansible_pipelining to False 24160 1726853537.84204: Set connection var ansible_connection to ssh 24160 1726853537.84210: Set connection var ansible_shell_type to sh 24160 1726853537.84223: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853537.84238: Set connection var ansible_timeout to 10 24160 1726853537.84264: variable 'ansible_shell_executable' from source: unknown 24160 1726853537.84275: variable 'ansible_connection' from source: unknown 24160 1726853537.84286: variable 'ansible_module_compression' from source: unknown 24160 1726853537.84302: variable 'ansible_shell_type' from source: unknown 24160 1726853537.84476: variable 'ansible_shell_executable' from source: unknown 24160 1726853537.84479: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853537.84481: variable 'ansible_pipelining' from source: unknown 24160 1726853537.84483: variable 'ansible_timeout' from source: unknown 24160 1726853537.84485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853537.84735: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853537.84746: variable 'omit' from source: magic vars 24160 1726853537.84749: starting attempt loop 24160 1726853537.84751: running the handler 24160 1726853537.84753: _low_level_execute_command(): starting 24160 1726853537.84755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853537.86722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853537.86740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853537.87207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853537.87210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853537.87263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853537.87357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853537.89013: stdout chunk (state=3): >>>/root <<< 24160 1726853537.89288: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853537.89437: stderr chunk (state=3): >>><<< 24160 1726853537.89441: stdout chunk (state=3): >>><<< 24160 1726853537.89444: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853537.89446: _low_level_execute_command(): starting 24160 1726853537.89622: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881 `" && echo ansible-tmp-1726853537.8941472-24862-113755653425881="` echo /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881 `" ) && sleep 0' 24160 1726853537.91228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 24160 1726853537.91232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853537.91234: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853537.91236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853537.91238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853537.91812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853537.91888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853537.93776: stdout chunk (state=3): >>>ansible-tmp-1726853537.8941472-24862-113755653425881=/root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881 <<< 24160 1726853537.93879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853537.94184: stderr chunk (state=3): >>><<< 24160 1726853537.94187: stdout chunk (state=3): >>><<< 24160 1726853537.94190: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853537.8941472-24862-113755653425881=/root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853537.94192: variable 'ansible_module_compression' from source: unknown 24160 1726853537.94198: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 24160 1726853537.94200: ANSIBALLZ: Acquiring lock 24160 1726853537.94203: ANSIBALLZ: Lock acquired: 140302793559168 24160 1726853537.94290: ANSIBALLZ: Creating module 24160 1726853538.28480: ANSIBALLZ: Writing module into payload 24160 1726853538.28796: ANSIBALLZ: Writing module 24160 1726853538.28819: ANSIBALLZ: Renaming module 24160 1726853538.28825: ANSIBALLZ: Done creating module 24160 1726853538.28858: variable 'ansible_facts' from source: unknown 24160 1726853538.28997: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py 24160 1726853538.29235: 
Sending initial data 24160 1726853538.29238: Sent initial data (168 bytes) 24160 1726853538.29838: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853538.29848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853538.29860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853538.29978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853538.29983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853538.29985: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853538.29987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853538.29989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853538.29992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853538.30076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853538.30080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853538.30082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853538.30124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853538.31813: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853538.32123: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853538.32127: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp3blq8fn4 /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py <<< 24160 1726853538.32129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py" <<< 24160 1726853538.32184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp3blq8fn4" to remote "/root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py" <<< 24160 1726853538.34102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853538.34229: stderr chunk (state=3): >>><<< 24160 1726853538.34359: stdout chunk (state=3): >>><<< 24160 1726853538.34363: done transferring module to remote 24160 1726853538.34365: 
_low_level_execute_command(): starting 24160 1726853538.34368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/ /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py && sleep 0' 24160 1726853538.35118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853538.35156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853538.35187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853538.35290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853538.35449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853538.35531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853538.37318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853538.37376: stderr chunk (state=3): >>><<< 24160 1726853538.37391: stdout chunk (state=3): >>><<< 24160 
1726853538.37422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853538.37434: _low_level_execute_command(): starting 24160 1726853538.37443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/AnsiballZ_network_connections.py && sleep 0' 24160 1726853538.38499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853538.38618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853538.38834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853538.38860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853538.38986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853538.65001: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, b1645e58-2740-4ae7-b45a-6a29b04ac1fe\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24160 1726853538.66750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853538.66765: stdout chunk (state=3): >>><<< 24160 1726853538.66778: stderr chunk (state=3): >>><<< 24160 1726853538.66809: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, b1645e58-2740-4ae7-b45a-6a29b04ac1fe\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853538.66859: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'type': 'ethernet', 'ip': {'ipv6_disabled': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853538.66875: _low_level_execute_command(): starting 24160 1726853538.66880: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853537.8941472-24862-113755653425881/ > /dev/null 2>&1 && sleep 0' 24160 1726853538.67369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853538.67375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853538.67378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853538.67380: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853538.67382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853538.67384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853538.67437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853538.67441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853538.67445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853538.67490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853538.69325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853538.69355: stderr chunk (state=3): >>><<< 24160 1726853538.69358: stdout chunk (state=3): >>><<< 24160 1726853538.69375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853538.69378: handler run complete 24160 1726853538.69400: attempt loop complete, returning result 24160 1726853538.69402: _execute() done 24160 1726853538.69409: dumping result to json 24160 1726853538.69411: done dumping result, returning 24160 1726853538.69419: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5676-4eb4-000000000027] 24160 1726853538.69421: sending task result for task 02083763-bbaf-5676-4eb4-000000000027 24160 1726853538.69528: done sending task result for task 02083763-bbaf-5676-4eb4-000000000027 24160 1726853538.69531: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, b1645e58-2740-4ae7-b45a-6a29b04ac1fe 24160 1726853538.69622: no more pending results, returning what we have 24160 1726853538.69625: results queue empty 24160 1726853538.69626: checking for any_errors_fatal 24160 1726853538.69632: done checking for 
any_errors_fatal 24160 1726853538.69633: checking for max_fail_percentage 24160 1726853538.69635: done checking for max_fail_percentage 24160 1726853538.69635: checking to see if all hosts have failed and the running result is not ok 24160 1726853538.69636: done checking to see if all hosts have failed 24160 1726853538.69637: getting the remaining hosts for this loop 24160 1726853538.69638: done getting the remaining hosts for this loop 24160 1726853538.69642: getting the next task for host managed_node1 24160 1726853538.69647: done getting next task for host managed_node1 24160 1726853538.69651: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24160 1726853538.69654: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853538.69664: getting variables 24160 1726853538.69665: in VariableManager get_vars() 24160 1726853538.69700: Calling all_inventory to load vars for managed_node1 24160 1726853538.69703: Calling groups_inventory to load vars for managed_node1 24160 1726853538.69706: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853538.69714: Calling all_plugins_play to load vars for managed_node1 24160 1726853538.69717: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853538.69719: Calling groups_plugins_play to load vars for managed_node1 24160 1726853538.70812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853538.71812: done with get_vars() 24160 1726853538.71828: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:32:18 -0400 (0:00:01.035) 0:00:15.121 ****** 24160 1726853538.71906: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24160 1726853538.71907: Creating lock for fedora.linux_system_roles.network_state 24160 1726853538.72162: worker is 1 (out of 1 available) 24160 1726853538.72178: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24160 1726853538.72191: done queuing things up, now waiting for results queue to drain 24160 1726853538.72193: waiting for pending results... 
24160 1726853538.72600: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 24160 1726853538.72605: in run() - task 02083763-bbaf-5676-4eb4-000000000028 24160 1726853538.72615: variable 'ansible_search_path' from source: unknown 24160 1726853538.72620: variable 'ansible_search_path' from source: unknown 24160 1726853538.72666: calling self._execute() 24160 1726853538.72772: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.72827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853538.72838: variable 'omit' from source: magic vars 24160 1726853538.73429: variable 'ansible_distribution_major_version' from source: facts 24160 1726853538.73442: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853538.73582: variable 'network_state' from source: role '' defaults 24160 1726853538.73591: Evaluated conditional (network_state != {}): False 24160 1726853538.73594: when evaluation is False, skipping this task 24160 1726853538.73597: _execute() done 24160 1726853538.73600: dumping result to json 24160 1726853538.73602: done dumping result, returning 24160 1726853538.73679: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5676-4eb4-000000000028] 24160 1726853538.73683: sending task result for task 02083763-bbaf-5676-4eb4-000000000028 24160 1726853538.73753: done sending task result for task 02083763-bbaf-5676-4eb4-000000000028 24160 1726853538.73756: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853538.73841: no more pending results, returning what we have 24160 1726853538.73845: results queue empty 24160 1726853538.73846: checking for any_errors_fatal 24160 1726853538.73857: done checking for any_errors_fatal 
24160 1726853538.73858: checking for max_fail_percentage 24160 1726853538.73860: done checking for max_fail_percentage 24160 1726853538.73861: checking to see if all hosts have failed and the running result is not ok 24160 1726853538.73861: done checking to see if all hosts have failed 24160 1726853538.73862: getting the remaining hosts for this loop 24160 1726853538.73864: done getting the remaining hosts for this loop 24160 1726853538.74004: getting the next task for host managed_node1 24160 1726853538.74010: done getting next task for host managed_node1 24160 1726853538.74013: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24160 1726853538.74016: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853538.74030: getting variables 24160 1726853538.74032: in VariableManager get_vars() 24160 1726853538.74063: Calling all_inventory to load vars for managed_node1 24160 1726853538.74065: Calling groups_inventory to load vars for managed_node1 24160 1726853538.74067: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853538.74099: Calling all_plugins_play to load vars for managed_node1 24160 1726853538.74103: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853538.74117: Calling groups_plugins_play to load vars for managed_node1 24160 1726853538.76122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853538.77900: done with get_vars() 24160 1726853538.77924: done getting variables 24160 1726853538.77989: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:32:18 -0400 (0:00:00.061) 0:00:15.182 ****** 24160 1726853538.78024: entering _queue_task() for managed_node1/debug 24160 1726853538.78367: worker is 1 (out of 1 available) 24160 1726853538.78382: exiting _queue_task() for managed_node1/debug 24160 1726853538.78394: done queuing things up, now waiting for results queue to drain 24160 1726853538.78395: waiting for pending results... 
24160 1726853538.78796: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24160 1726853538.78977: in run() - task 02083763-bbaf-5676-4eb4-000000000029 24160 1726853538.78981: variable 'ansible_search_path' from source: unknown 24160 1726853538.78984: variable 'ansible_search_path' from source: unknown 24160 1726853538.78987: calling self._execute() 24160 1726853538.78989: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.78992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853538.78996: variable 'omit' from source: magic vars 24160 1726853538.79357: variable 'ansible_distribution_major_version' from source: facts 24160 1726853538.79374: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853538.79381: variable 'omit' from source: magic vars 24160 1726853538.79441: variable 'omit' from source: magic vars 24160 1726853538.79480: variable 'omit' from source: magic vars 24160 1726853538.79519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853538.79551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853538.79582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853538.79600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853538.79611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853538.79640: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853538.79644: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.79646: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24160 1726853538.79750: Set connection var ansible_shell_executable to /bin/sh 24160 1726853538.79880: Set connection var ansible_pipelining to False 24160 1726853538.79884: Set connection var ansible_connection to ssh 24160 1726853538.79887: Set connection var ansible_shell_type to sh 24160 1726853538.79890: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853538.79893: Set connection var ansible_timeout to 10 24160 1726853538.79895: variable 'ansible_shell_executable' from source: unknown 24160 1726853538.79898: variable 'ansible_connection' from source: unknown 24160 1726853538.79901: variable 'ansible_module_compression' from source: unknown 24160 1726853538.79903: variable 'ansible_shell_type' from source: unknown 24160 1726853538.79906: variable 'ansible_shell_executable' from source: unknown 24160 1726853538.79909: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.79911: variable 'ansible_pipelining' from source: unknown 24160 1726853538.79914: variable 'ansible_timeout' from source: unknown 24160 1726853538.79917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853538.79987: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853538.79998: variable 'omit' from source: magic vars 24160 1726853538.80003: starting attempt loop 24160 1726853538.80007: running the handler 24160 1726853538.80130: variable '__network_connections_result' from source: set_fact 24160 1726853538.80187: handler run complete 24160 1726853538.80202: attempt loop complete, returning result 24160 1726853538.80205: _execute() done 24160 1726853538.80208: dumping result to json 24160 1726853538.80211: 
done dumping result, returning 24160 1726853538.80220: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5676-4eb4-000000000029] 24160 1726853538.80223: sending task result for task 02083763-bbaf-5676-4eb4-000000000029 24160 1726853538.80428: done sending task result for task 02083763-bbaf-5676-4eb4-000000000029 24160 1726853538.80431: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, b1645e58-2740-4ae7-b45a-6a29b04ac1fe" ] } 24160 1726853538.80494: no more pending results, returning what we have 24160 1726853538.80497: results queue empty 24160 1726853538.80498: checking for any_errors_fatal 24160 1726853538.80504: done checking for any_errors_fatal 24160 1726853538.80504: checking for max_fail_percentage 24160 1726853538.80506: done checking for max_fail_percentage 24160 1726853538.80506: checking to see if all hosts have failed and the running result is not ok 24160 1726853538.80507: done checking to see if all hosts have failed 24160 1726853538.80508: getting the remaining hosts for this loop 24160 1726853538.80509: done getting the remaining hosts for this loop 24160 1726853538.80512: getting the next task for host managed_node1 24160 1726853538.80518: done getting next task for host managed_node1 24160 1726853538.80521: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24160 1726853538.80524: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853538.80534: getting variables 24160 1726853538.80535: in VariableManager get_vars() 24160 1726853538.80572: Calling all_inventory to load vars for managed_node1 24160 1726853538.80575: Calling groups_inventory to load vars for managed_node1 24160 1726853538.80577: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853538.80587: Calling all_plugins_play to load vars for managed_node1 24160 1726853538.80589: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853538.80592: Calling groups_plugins_play to load vars for managed_node1 24160 1726853538.82025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853538.83882: done with get_vars() 24160 1726853538.83907: done getting variables 24160 1726853538.83970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:32:18 -0400 (0:00:00.059) 0:00:15.242 ****** 24160 1726853538.84008: entering _queue_task() for managed_node1/debug 24160 1726853538.84494: worker is 1 (out of 1 available) 24160 1726853538.84504: exiting _queue_task() for managed_node1/debug 24160 1726853538.84514: done queuing things up, now waiting for results queue to drain 24160 1726853538.84515: waiting for pending results... 
24160 1726853538.84662: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24160 1726853538.84847: in run() - task 02083763-bbaf-5676-4eb4-00000000002a 24160 1726853538.84874: variable 'ansible_search_path' from source: unknown 24160 1726853538.84878: variable 'ansible_search_path' from source: unknown 24160 1726853538.84916: calling self._execute() 24160 1726853538.85126: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.85133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853538.85188: variable 'omit' from source: magic vars 24160 1726853538.85539: variable 'ansible_distribution_major_version' from source: facts 24160 1726853538.85558: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853538.85562: variable 'omit' from source: magic vars 24160 1726853538.85619: variable 'omit' from source: magic vars 24160 1726853538.85656: variable 'omit' from source: magic vars 24160 1726853538.85731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853538.85735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853538.85751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853538.85777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853538.85783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853538.85812: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853538.85816: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.85818: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24160 1726853538.85948: Set connection var ansible_shell_executable to /bin/sh 24160 1726853538.85951: Set connection var ansible_pipelining to False 24160 1726853538.85954: Set connection var ansible_connection to ssh 24160 1726853538.85956: Set connection var ansible_shell_type to sh 24160 1726853538.85958: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853538.85962: Set connection var ansible_timeout to 10 24160 1726853538.85969: variable 'ansible_shell_executable' from source: unknown 24160 1726853538.85985: variable 'ansible_connection' from source: unknown 24160 1726853538.86102: variable 'ansible_module_compression' from source: unknown 24160 1726853538.86106: variable 'ansible_shell_type' from source: unknown 24160 1726853538.86109: variable 'ansible_shell_executable' from source: unknown 24160 1726853538.86111: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.86114: variable 'ansible_pipelining' from source: unknown 24160 1726853538.86117: variable 'ansible_timeout' from source: unknown 24160 1726853538.86119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853538.86139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853538.86150: variable 'omit' from source: magic vars 24160 1726853538.86156: starting attempt loop 24160 1726853538.86163: running the handler 24160 1726853538.86215: variable '__network_connections_result' from source: set_fact 24160 1726853538.86295: variable '__network_connections_result' from source: set_fact 24160 1726853538.86412: handler run complete 24160 1726853538.86440: attempt loop complete, returning result 24160 1726853538.86444: 
_execute() done 24160 1726853538.86446: dumping result to json 24160 1726853538.86449: done dumping result, returning 24160 1726853538.86464: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5676-4eb4-00000000002a] 24160 1726853538.86467: sending task result for task 02083763-bbaf-5676-4eb4-00000000002a 24160 1726853538.86681: done sending task result for task 02083763-bbaf-5676-4eb4-00000000002a 24160 1726853538.86684: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, b1645e58-2740-4ae7-b45a-6a29b04ac1fe\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, b1645e58-2740-4ae7-b45a-6a29b04ac1fe" ] } } 24160 1726853538.86772: no more pending results, returning what we have 24160 1726853538.86776: results queue empty 24160 1726853538.86777: checking for any_errors_fatal 24160 1726853538.86784: done checking for any_errors_fatal 24160 1726853538.86785: checking for max_fail_percentage 24160 1726853538.86787: done checking for max_fail_percentage 24160 1726853538.86787: checking to see if all hosts have failed and the running result is not ok 24160 1726853538.86788: done checking to see if all hosts have failed 24160 1726853538.86789: getting the remaining hosts for this loop 24160 1726853538.86790: done getting the remaining hosts for this loop 24160 1726853538.86794: getting the next task for host managed_node1 24160 1726853538.86800: done 
getting next task for host managed_node1 24160 1726853538.86804: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24160 1726853538.86807: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853538.86818: getting variables 24160 1726853538.86820: in VariableManager get_vars() 24160 1726853538.86857: Calling all_inventory to load vars for managed_node1 24160 1726853538.86861: Calling groups_inventory to load vars for managed_node1 24160 1726853538.86863: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853538.87376: Calling all_plugins_play to load vars for managed_node1 24160 1726853538.87381: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853538.87385: Calling groups_plugins_play to load vars for managed_node1 24160 1726853538.89999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853538.91807: done with get_vars() 24160 1726853538.91838: done getting variables 24160 1726853538.91961: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:32:18 -0400 (0:00:00.080) 0:00:15.323 ****** 24160 1726853538.92051: entering _queue_task() for managed_node1/debug 24160 1726853538.92634: worker is 1 (out of 1 available) 24160 1726853538.92887: exiting _queue_task() for managed_node1/debug 24160 1726853538.92899: done queuing things up, now waiting for results queue to drain 24160 1726853538.92901: waiting for pending results... 24160 1726853538.93376: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24160 1726853538.93809: in run() - task 02083763-bbaf-5676-4eb4-00000000002b 24160 1726853538.93880: variable 'ansible_search_path' from source: unknown 24160 1726853538.93884: variable 'ansible_search_path' from source: unknown 24160 1726853538.93887: calling self._execute() 24160 1726853538.94043: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853538.94054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853538.94277: variable 'omit' from source: magic vars 24160 1726853538.94637: variable 'ansible_distribution_major_version' from source: facts 24160 1726853538.94651: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853538.94795: variable 'network_state' from source: role '' defaults 24160 1726853538.94805: Evaluated conditional (network_state != {}): False 24160 1726853538.94808: when evaluation is False, skipping this task 24160 1726853538.94811: _execute() done 24160 1726853538.94814: dumping result to json 24160 1726853538.94816: done dumping result, returning 24160 1726853538.94825: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5676-4eb4-00000000002b] 24160 1726853538.94830: sending task result for task 
02083763-bbaf-5676-4eb4-00000000002b 24160 1726853538.94926: done sending task result for task 02083763-bbaf-5676-4eb4-00000000002b 24160 1726853538.94930: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 24160 1726853538.94987: no more pending results, returning what we have 24160 1726853538.94997: results queue empty 24160 1726853538.94999: checking for any_errors_fatal 24160 1726853538.95008: done checking for any_errors_fatal 24160 1726853538.95009: checking for max_fail_percentage 24160 1726853538.95011: done checking for max_fail_percentage 24160 1726853538.95012: checking to see if all hosts have failed and the running result is not ok 24160 1726853538.95012: done checking to see if all hosts have failed 24160 1726853538.95013: getting the remaining hosts for this loop 24160 1726853538.95015: done getting the remaining hosts for this loop 24160 1726853538.95019: getting the next task for host managed_node1 24160 1726853538.95027: done getting next task for host managed_node1 24160 1726853538.95032: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24160 1726853538.95036: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853538.95052: getting variables 24160 1726853538.95056: in VariableManager get_vars() 24160 1726853538.95175: Calling all_inventory to load vars for managed_node1 24160 1726853538.95179: Calling groups_inventory to load vars for managed_node1 24160 1726853538.95183: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853538.95197: Calling all_plugins_play to load vars for managed_node1 24160 1726853538.95201: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853538.95212: Calling groups_plugins_play to load vars for managed_node1 24160 1726853538.96819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853538.99098: done with get_vars() 24160 1726853538.99124: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:32:18 -0400 (0:00:00.072) 0:00:15.395 ****** 24160 1726853538.99290: entering _queue_task() for managed_node1/ping 24160 1726853538.99292: Creating lock for ping 24160 1726853538.99815: worker is 1 (out of 1 available) 24160 1726853538.99826: exiting _queue_task() for managed_node1/ping 24160 1726853538.99836: done queuing things up, now waiting for results queue to drain 24160 1726853538.99837: waiting for pending results... 
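The `skipping:` result a few records above comes straight from the `when:` evaluation: `network_state` is the role default `{}`, so `network_state != {}` is false and no module is ever shipped to the host; the controller fabricates the skip result locally, recording the failed expression as `false_condition`. A rough shell schematic of that decision (not Ansible's actual code, and the JSON shape is abbreviated):

```shell
# Schematic of the conditional skip logged above. The real evaluation is a
# Jinja2 expression on the controller; this just mirrors the outcome.
network_state='{}'                      # role default, as logged
if [ "$network_state" != "{}" ]; then
    echo "would queue the debug module for the worker"
else
    # no task runs on the host; a local skip result is emitted instead
    printf '%s\n' '{"skipped": true, "false_condition": "network_state != {}"}'
fi
```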
24160 1726853539.00390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 24160 1726853539.00396: in run() - task 02083763-bbaf-5676-4eb4-00000000002c 24160 1726853539.00399: variable 'ansible_search_path' from source: unknown 24160 1726853539.00402: variable 'ansible_search_path' from source: unknown 24160 1726853539.00404: calling self._execute() 24160 1726853539.00576: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.00580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853539.00584: variable 'omit' from source: magic vars 24160 1726853539.00878: variable 'ansible_distribution_major_version' from source: facts 24160 1726853539.00924: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853539.00931: variable 'omit' from source: magic vars 24160 1726853539.01027: variable 'omit' from source: magic vars 24160 1726853539.01069: variable 'omit' from source: magic vars 24160 1726853539.01107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853539.01149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853539.01194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853539.01376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853539.01379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853539.01382: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853539.01384: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.01386: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 24160 1726853539.01388: Set connection var ansible_shell_executable to /bin/sh 24160 1726853539.01390: Set connection var ansible_pipelining to False 24160 1726853539.01392: Set connection var ansible_connection to ssh 24160 1726853539.01394: Set connection var ansible_shell_type to sh 24160 1726853539.01396: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853539.01403: Set connection var ansible_timeout to 10 24160 1726853539.01423: variable 'ansible_shell_executable' from source: unknown 24160 1726853539.01426: variable 'ansible_connection' from source: unknown 24160 1726853539.01429: variable 'ansible_module_compression' from source: unknown 24160 1726853539.01432: variable 'ansible_shell_type' from source: unknown 24160 1726853539.01441: variable 'ansible_shell_executable' from source: unknown 24160 1726853539.01444: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.01449: variable 'ansible_pipelining' from source: unknown 24160 1726853539.01452: variable 'ansible_timeout' from source: unknown 24160 1726853539.01458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853539.01678: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853539.01687: variable 'omit' from source: magic vars 24160 1726853539.01692: starting attempt loop 24160 1726853539.01695: running the handler 24160 1726853539.01713: _low_level_execute_command(): starting 24160 1726853539.01721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853539.02541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.02583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.02607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.02610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.02709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.04524: stdout chunk (state=3): >>>/root <<< 24160 1726853539.04616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.04667: stderr chunk (state=3): >>><<< 24160 1726853539.04721: stdout chunk (state=3): >>><<< 24160 1726853539.04743: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.04853: _low_level_execute_command(): starting 24160 1726853539.04859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108 `" && echo ansible-tmp-1726853539.0474405-24912-73594953851108="` echo /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108 `" ) && sleep 0' 24160 1726853539.05427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853539.05432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853539.05435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.05438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853539.05440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853539.05442: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853539.05444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.05458: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853539.05461: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853539.05629: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.05633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.05637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.05645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.05703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.07598: stdout chunk (state=3): >>>ansible-tmp-1726853539.0474405-24912-73594953851108=/root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108 <<< 24160 1726853539.07876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.07879: stdout chunk (state=3): >>><<< 24160 1726853539.07882: stderr chunk (state=3): >>><<< 24160 1726853539.07884: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853539.0474405-24912-73594953851108=/root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.07886: variable 'ansible_module_compression' from source: unknown 24160 1726853539.07888: ANSIBALLZ: Using lock for ping 24160 1726853539.07890: ANSIBALLZ: Acquiring lock 24160 1726853539.07891: ANSIBALLZ: Lock acquired: 140302793795408 24160 1726853539.07893: ANSIBALLZ: Creating module 24160 1726853539.21130: ANSIBALLZ: Writing module into payload 24160 1726853539.21225: ANSIBALLZ: Writing module 24160 1726853539.21251: ANSIBALLZ: Renaming module 24160 1726853539.21258: ANSIBALLZ: Done creating module 24160 1726853539.21277: variable 'ansible_facts' from source: unknown 24160 1726853539.21362: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py 24160 1726853539.21550: Sending initial data 24160 1726853539.21556: Sent initial data (152 bytes) 24160 1726853539.22445: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853539.22450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.22602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.22817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.24336: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853539.24374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853539.24419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfcvcw593 /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py <<< 24160 1726853539.24422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py" <<< 24160 1726853539.24455: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfcvcw593" to remote "/root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py" <<< 24160 1726853539.25557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.25678: stderr chunk (state=3): >>><<< 24160 1726853539.25682: stdout chunk (state=3): >>><<< 24160 1726853539.25685: done transferring module to remote 24160 1726853539.25688: _low_level_execute_command(): starting 24160 1726853539.25691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/ /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py && sleep 0' 24160 1726853539.26244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853539.26285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.26348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.26362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.26465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.26531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.28378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.28381: stdout chunk (state=3): >>><<< 24160 1726853539.28390: stderr chunk (state=3): >>><<< 24160 1726853539.28405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.28408: _low_level_execute_command(): starting 24160 1726853539.28411: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/AnsiballZ_ping.py && sleep 0' 24160 1726853539.28964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853539.28988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853539.29086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.29111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.29128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.29151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 
1726853539.29236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.44380: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24160 1726853539.45835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853539.45885: stderr chunk (state=3): >>><<< 24160 1726853539.45897: stdout chunk (state=3): >>><<< 24160 1726853539.46072: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
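The `{"ping": "pong", ...}` stdout chunk is the entire useful output of the transferred `AnsiballZ_ping.py`: the remote module prints a single JSON document on stdout, and the controller parses it back off the SSH channel. A minimal stand-in for what the remote side emits (the real `ping` module also echoes a caller-supplied `data` argument and deliberately raises when it is `"crash"`, to let tests exercise failure handling):

```shell
# Stand-in for the remote module's success path: one JSON line on stdout.
python3 - <<'EOF'
import json

# Defaults mirror the logged invocation: module_args == {"data": "pong"}.
data = "pong"
print(json.dumps({"ping": data, "invocation": {"module_args": {"data": data}}}))
EOF
```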
24160 1726853539.46283: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853539.46287: _low_level_execute_command(): starting 24160 1726853539.46289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853539.0474405-24912-73594953851108/ > /dev/null 2>&1 && sleep 0' 24160 1726853539.47427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853539.47452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853539.47513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853539.47616: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.47715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.47810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.47915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.50020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.50023: stdout chunk (state=3): >>><<< 24160 1726853539.50025: stderr chunk (state=3): >>><<< 24160 1726853539.50045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.50076: handler run complete 
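The `_low_level_execute_command()` calls above bracket every module run with the same remote temp-dir lifecycle: create a per-task directory under `~/.ansible/tmp` with a tight umask, `chmod u+x` it plus the uploaded `AnsiballZ_*.py`, execute, then `rm -f -r` the whole directory. Re-played by hand (the `ansible-tmp-demo` name is illustrative; real runs use the timestamp-PID-random suffix seen in the log):

```shell
# Mirror of the logged tmpdir lifecycle, with an illustrative fixed name.
umask 77                               # dirs/files readable by the owner only
base="$HOME/.ansible/tmp"
tmp="$base/ansible-tmp-demo"
mkdir -p "$base" && mkdir "$tmp" && echo "created $tmp"
chmod u+x "$tmp"                       # the real run also chmods AnsiballZ_ping.py
rm -f -r "$tmp" > /dev/null 2>&1 && sleep 0   # cleanup step, exactly as logged
```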
24160 1726853539.50479: attempt loop complete, returning result 24160 1726853539.50483: _execute() done 24160 1726853539.50486: dumping result to json 24160 1726853539.50488: done dumping result, returning 24160 1726853539.50490: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5676-4eb4-00000000002c] 24160 1726853539.50492: sending task result for task 02083763-bbaf-5676-4eb4-00000000002c 24160 1726853539.50565: done sending task result for task 02083763-bbaf-5676-4eb4-00000000002c 24160 1726853539.50568: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 24160 1726853539.50657: no more pending results, returning what we have 24160 1726853539.50662: results queue empty 24160 1726853539.50663: checking for any_errors_fatal 24160 1726853539.50669: done checking for any_errors_fatal 24160 1726853539.50673: checking for max_fail_percentage 24160 1726853539.50675: done checking for max_fail_percentage 24160 1726853539.50675: checking to see if all hosts have failed and the running result is not ok 24160 1726853539.50676: done checking to see if all hosts have failed 24160 1726853539.50677: getting the remaining hosts for this loop 24160 1726853539.50683: done getting the remaining hosts for this loop 24160 1726853539.50688: getting the next task for host managed_node1 24160 1726853539.50700: done getting next task for host managed_node1 24160 1726853539.50702: ^ task is: TASK: meta (role_complete) 24160 1726853539.50705: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853539.50718: getting variables 24160 1726853539.50719: in VariableManager get_vars() 24160 1726853539.50765: Calling all_inventory to load vars for managed_node1 24160 1726853539.50768: Calling groups_inventory to load vars for managed_node1 24160 1726853539.50770: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853539.51107: Calling all_plugins_play to load vars for managed_node1 24160 1726853539.51110: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853539.51113: Calling groups_plugins_play to load vars for managed_node1 24160 1726853539.53072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853539.56080: done with get_vars() 24160 1726853539.56115: done getting variables 24160 1726853539.56324: done queuing things up, now waiting for results queue to drain 24160 1726853539.56326: results queue empty 24160 1726853539.56327: checking for any_errors_fatal 24160 1726853539.56329: done checking for any_errors_fatal 24160 1726853539.56332: checking for max_fail_percentage 24160 1726853539.56333: done checking for max_fail_percentage 24160 1726853539.56334: checking to see if all hosts have failed and the running result is not ok 24160 1726853539.56334: done checking to see if all hosts have failed 24160 1726853539.56335: getting the remaining hosts for this loop 24160 1726853539.56336: done getting the remaining hosts for this loop 24160 1726853539.56339: getting the next task for host managed_node1 24160 1726853539.56343: done getting next task for host managed_node1 24160 1726853539.56345: ^ task is: TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 24160 1726853539.56346: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853539.56349: getting variables 24160 1726853539.56350: in VariableManager get_vars() 24160 1726853539.56480: Calling all_inventory to load vars for managed_node1 24160 1726853539.56483: Calling groups_inventory to load vars for managed_node1 24160 1726853539.56485: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853539.56490: Calling all_plugins_play to load vars for managed_node1 24160 1726853539.56492: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853539.56495: Calling groups_plugins_play to load vars for managed_node1 24160 1726853539.59562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853539.62494: done with get_vars() 24160 1726853539.62518: done getting variables 24160 1726853539.62565: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:41 Friday 20 September 2024 13:32:19 -0400 (0:00:00.634) 0:00:16.029 ****** 24160 1726853539.62712: entering _queue_task() for managed_node1/assert 24160 1726853539.63494: worker is 1 (out of 1 available) 24160 1726853539.63508: exiting _queue_task() for managed_node1/assert 24160 1726853539.63521: done queuing things up, now waiting for results queue to drain 24160 
1726853539.63523: waiting for pending results... 24160 1726853539.64531: running TaskExecutor() for managed_node1/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 24160 1726853539.64540: in run() - task 02083763-bbaf-5676-4eb4-00000000005c 24160 1726853539.64641: variable 'ansible_search_path' from source: unknown 24160 1726853539.64646: calling self._execute() 24160 1726853539.64716: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.64722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853539.64733: variable 'omit' from source: magic vars 24160 1726853539.65128: variable 'ansible_distribution_major_version' from source: facts 24160 1726853539.65141: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853539.65258: variable '__network_connections_result' from source: set_fact 24160 1726853539.65274: Evaluated conditional (__network_connections_result.failed): False 24160 1726853539.65277: when evaluation is False, skipping this task 24160 1726853539.65281: _execute() done 24160 1726853539.65284: dumping result to json 24160 1726853539.65286: done dumping result, returning 24160 1726853539.65303: done running TaskExecutor() for managed_node1/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it [02083763-bbaf-5676-4eb4-00000000005c] 24160 1726853539.65306: sending task result for task 02083763-bbaf-5676-4eb4-00000000005c 24160 1726853539.65658: done sending task result for task 02083763-bbaf-5676-4eb4-00000000005c 24160 1726853539.65662: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 24160 1726853539.65704: no more pending results, returning what we have 24160 1726853539.65707: results queue 
empty 24160 1726853539.65708: checking for any_errors_fatal 24160 1726853539.65710: done checking for any_errors_fatal 24160 1726853539.65711: checking for max_fail_percentage 24160 1726853539.65713: done checking for max_fail_percentage 24160 1726853539.65713: checking to see if all hosts have failed and the running result is not ok 24160 1726853539.65714: done checking to see if all hosts have failed 24160 1726853539.65715: getting the remaining hosts for this loop 24160 1726853539.65716: done getting the remaining hosts for this loop 24160 1726853539.65720: getting the next task for host managed_node1 24160 1726853539.65725: done getting next task for host managed_node1 24160 1726853539.65727: ^ task is: TASK: Verify nmcli connection ipv6.method 24160 1726853539.65730: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853539.65733: getting variables 24160 1726853539.65735: in VariableManager get_vars() 24160 1726853539.65776: Calling all_inventory to load vars for managed_node1 24160 1726853539.65779: Calling groups_inventory to load vars for managed_node1 24160 1726853539.65781: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853539.65790: Calling all_plugins_play to load vars for managed_node1 24160 1726853539.65793: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853539.65795: Calling groups_plugins_play to load vars for managed_node1 24160 1726853539.67405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853539.69194: done with get_vars() 24160 1726853539.69215: done getting variables 24160 1726853539.69309: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Verify nmcli connection ipv6.method] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:48 Friday 20 September 2024 13:32:19 -0400 (0:00:00.066) 0:00:16.096 ****** 24160 1726853539.69334: entering _queue_task() for managed_node1/shell 24160 1726853539.69336: Creating lock for shell 24160 1726853539.69682: worker is 1 (out of 1 available) 24160 1726853539.69695: exiting _queue_task() for managed_node1/shell 24160 1726853539.69709: done queuing things up, now waiting for results queue to drain 24160 1726853539.69711: waiting for pending results... 
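The `mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo .../ansible-tmp-... `"` command that follows creates a uniquely named per-task remote work directory (here `ansible-tmp-1726853539.7416792-24955-221589561206685`). The name visibly combines a timestamp, a pid-like number, and a random suffix; the exact scheme is an ansible-core implementation detail, so the reconstruction below is an approximation for illustration only, not a stable API:

```python
import os
import random
import re
import time

def make_remote_tmp_name() -> str:
    # Approximates the ansible-tmp-<epoch>-<pid>-<random> naming seen
    # in the log. Assumption: the three fields are epoch time, worker
    # pid, and a random integer; treat this as illustrative only.
    return "ansible-tmp-%s-%s-%s" % (
        time.time(),            # e.g. 1726853539.7416792
        os.getpid(),            # e.g. 24955
        random.randint(0, 2**48)  # e.g. 221589561206685
    )

name = make_remote_tmp_name()
```

The `umask 77` in the logged command ensures the directory is created mode 0700, readable only by the connecting user.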
24160 1726853539.70018: running TaskExecutor() for managed_node1/TASK: Verify nmcli connection ipv6.method 24160 1726853539.70114: in run() - task 02083763-bbaf-5676-4eb4-00000000005d 24160 1726853539.70126: variable 'ansible_search_path' from source: unknown 24160 1726853539.70158: calling self._execute() 24160 1726853539.70238: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.70241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853539.70250: variable 'omit' from source: magic vars 24160 1726853539.70580: variable 'ansible_distribution_major_version' from source: facts 24160 1726853539.70584: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853539.70708: variable '__network_connections_result' from source: set_fact 24160 1726853539.70739: Evaluated conditional (not __network_connections_result.failed): True 24160 1726853539.70762: variable 'omit' from source: magic vars 24160 1726853539.70765: variable 'omit' from source: magic vars 24160 1726853539.70834: variable 'interface' from source: set_fact 24160 1726853539.70886: variable 'omit' from source: magic vars 24160 1726853539.70909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853539.70935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853539.70952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853539.70968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853539.70979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853539.71016: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853539.71019: 
variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.71021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853539.71159: Set connection var ansible_shell_executable to /bin/sh 24160 1726853539.71163: Set connection var ansible_pipelining to False 24160 1726853539.71167: Set connection var ansible_connection to ssh 24160 1726853539.71174: Set connection var ansible_shell_type to sh 24160 1726853539.71176: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853539.71179: Set connection var ansible_timeout to 10 24160 1726853539.71181: variable 'ansible_shell_executable' from source: unknown 24160 1726853539.71183: variable 'ansible_connection' from source: unknown 24160 1726853539.71190: variable 'ansible_module_compression' from source: unknown 24160 1726853539.71193: variable 'ansible_shell_type' from source: unknown 24160 1726853539.71195: variable 'ansible_shell_executable' from source: unknown 24160 1726853539.71197: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853539.71199: variable 'ansible_pipelining' from source: unknown 24160 1726853539.71201: variable 'ansible_timeout' from source: unknown 24160 1726853539.71204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853539.71350: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853539.71388: variable 'omit' from source: magic vars 24160 1726853539.71446: starting attempt loop 24160 1726853539.71449: running the handler 24160 1726853539.71500: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853539.71503: _low_level_execute_command(): starting 24160 1726853539.71505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853539.72215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.72219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.72221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.72246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.72326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.74010: stdout chunk (state=3): >>>/root <<< 24160 1726853539.74099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.74134: stderr chunk (state=3): >>><<< 24160 1726853539.74140: stdout chunk 
(state=3): >>><<< 24160 1726853539.74163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.74180: _low_level_execute_command(): starting 24160 1726853539.74187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685 `" && echo ansible-tmp-1726853539.7416792-24955-221589561206685="` echo /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685 `" ) && sleep 0' 24160 1726853539.74577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.74611: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.74649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.74663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.74709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.76595: stdout chunk (state=3): >>>ansible-tmp-1726853539.7416792-24955-221589561206685=/root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685 <<< 24160 1726853539.76707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.76728: stderr chunk (state=3): >>><<< 24160 1726853539.76731: stdout chunk (state=3): >>><<< 24160 1726853539.76746: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853539.7416792-24955-221589561206685=/root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.76775: variable 'ansible_module_compression' from source: unknown 24160 1726853539.76820: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853539.76847: variable 'ansible_facts' from source: unknown 24160 1726853539.76906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py 24160 1726853539.77003: Sending initial data 24160 1726853539.77007: Sent initial data (156 bytes) 24160 1726853539.77422: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.77425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853539.77427: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853539.77429: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853539.77434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.77483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.77486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.77528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.79034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 24160 1726853539.79043: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853539.79075: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 24160 1726853539.79112: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfwjf7s0h /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py <<< 24160 1726853539.79116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py" <<< 24160 1726853539.79152: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfwjf7s0h" to remote "/root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py" <<< 24160 1726853539.79157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py" <<< 24160 1726853539.79664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.79702: stderr chunk (state=3): >>><<< 24160 1726853539.79705: stdout chunk (state=3): >>><<< 24160 1726853539.79744: done transferring module to remote 24160 1726853539.79751: _low_level_execute_command(): starting 24160 1726853539.79755: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/ /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py && sleep 0' 24160 1726853539.80144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.80183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853539.80186: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.80191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.80194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.80233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853539.80236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.80283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.81994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853539.82020: stderr chunk (state=3): >>><<< 24160 1726853539.82023: stdout chunk (state=3): >>><<< 24160 1726853539.82035: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853539.82038: _low_level_execute_command(): starting 24160 1726853539.82042: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/AnsiballZ_command.py && sleep 0' 24160 1726853539.82433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853539.82438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.82459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853539.82505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 24160 1726853539.82509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853539.82560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853539.99413: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-20 13:32:19.975669", "end": "2024-09-20 13:32:19.993183", "delta": "0:00:00.017514", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853540.00986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853540.01015: stderr chunk (state=3): >>><<< 24160 1726853540.01018: stdout chunk (state=3): >>><<< 24160 1726853540.01041: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-20 13:32:19.975669", "end": "2024-09-20 13:32:19.993183", "delta": "0:00:00.017514", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853540.01074: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853540.01081: _low_level_execute_command(): starting 24160 1726853540.01086: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853539.7416792-24955-221589561206685/ > /dev/null 2>&1 && sleep 0' 24160 1726853540.01616: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853540.01619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853540.01622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853540.01624: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853540.01626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853540.01692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853540.01695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853540.01701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853540.01746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853540.03577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853540.03608: stderr chunk (state=3): >>><<< 24160 1726853540.03611: stdout chunk (state=3): >>><<< 24160 1726853540.03625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853540.03631: handler run complete 24160 1726853540.03647: Evaluated conditional (False): False 24160 1726853540.03655: attempt loop complete, returning result 24160 1726853540.03661: _execute() done 24160 1726853540.03663: dumping result to json 24160 1726853540.03669: done dumping result, returning 24160 1726853540.03678: done running TaskExecutor() for managed_node1/TASK: Verify nmcli connection ipv6.method [02083763-bbaf-5676-4eb4-00000000005d] 24160 1726853540.03698: sending task result for task 02083763-bbaf-5676-4eb4-00000000005d 24160 1726853540.03802: done sending task result for task 02083763-bbaf-5676-4eb4-00000000005d 24160 1726853540.03805: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "delta": "0:00:00.017514", "end": "2024-09-20 13:32:19.993183", "rc": 0, "start": "2024-09-20 13:32:19.975669" } STDOUT: ipv6.method: disabled STDERR: + nmcli connection show ethtest0 + grep ipv6.method 24160 1726853540.03899: no more pending results, returning what we have 24160 1726853540.03903: results queue empty 24160 1726853540.03905: checking for any_errors_fatal 24160 1726853540.03913: done checking for any_errors_fatal 24160 1726853540.03914: checking for max_fail_percentage 24160 1726853540.03916: done checking for max_fail_percentage 24160 1726853540.03917: checking to see if all hosts have failed and the running result is not ok 24160 1726853540.03917: done checking to see if all hosts have failed 24160 1726853540.03918: getting the remaining hosts for this loop 24160 1726853540.03919: done getting the remaining hosts for this loop 24160 
1726853540.03922: getting the next task for host managed_node1 24160 1726853540.03928: done getting next task for host managed_node1 24160 1726853540.03931: ^ task is: TASK: Assert that ipv6.method disabled is configured correctly 24160 1726853540.03933: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853540.03936: getting variables 24160 1726853540.03937: in VariableManager get_vars() 24160 1726853540.03977: Calling all_inventory to load vars for managed_node1 24160 1726853540.03980: Calling groups_inventory to load vars for managed_node1 24160 1726853540.04004: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.04016: Calling all_plugins_play to load vars for managed_node1 24160 1726853540.04019: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.04021: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.05021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.06133: done with get_vars() 24160 1726853540.06155: done getting variables 24160 1726853540.06213: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that ipv6.method disabled is configured correctly] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:57 Friday 20 September 2024 13:32:20 -0400 (0:00:00.369) 0:00:16.465 ****** 24160 1726853540.06235: 
entering _queue_task() for managed_node1/assert 24160 1726853540.06490: worker is 1 (out of 1 available) 24160 1726853540.06505: exiting _queue_task() for managed_node1/assert 24160 1726853540.06517: done queuing things up, now waiting for results queue to drain 24160 1726853540.06519: waiting for pending results... 24160 1726853540.06737: running TaskExecutor() for managed_node1/TASK: Assert that ipv6.method disabled is configured correctly 24160 1726853540.06802: in run() - task 02083763-bbaf-5676-4eb4-00000000005e 24160 1726853540.06814: variable 'ansible_search_path' from source: unknown 24160 1726853540.06844: calling self._execute() 24160 1726853540.06936: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.06940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.06942: variable 'omit' from source: magic vars 24160 1726853540.07257: variable 'ansible_distribution_major_version' from source: facts 24160 1726853540.07264: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853540.07381: variable '__network_connections_result' from source: set_fact 24160 1726853540.07391: Evaluated conditional (not __network_connections_result.failed): True 24160 1726853540.07415: variable 'omit' from source: magic vars 24160 1726853540.07418: variable 'omit' from source: magic vars 24160 1726853540.07446: variable 'omit' from source: magic vars 24160 1726853540.07480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853540.07531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853540.07550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853540.07565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 
1726853540.07581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853540.07604: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853540.07606: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.07620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.07696: Set connection var ansible_shell_executable to /bin/sh 24160 1726853540.07699: Set connection var ansible_pipelining to False 24160 1726853540.07702: Set connection var ansible_connection to ssh 24160 1726853540.07705: Set connection var ansible_shell_type to sh 24160 1726853540.07712: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853540.07727: Set connection var ansible_timeout to 10 24160 1726853540.07767: variable 'ansible_shell_executable' from source: unknown 24160 1726853540.07772: variable 'ansible_connection' from source: unknown 24160 1726853540.07775: variable 'ansible_module_compression' from source: unknown 24160 1726853540.07778: variable 'ansible_shell_type' from source: unknown 24160 1726853540.07780: variable 'ansible_shell_executable' from source: unknown 24160 1726853540.07782: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.07784: variable 'ansible_pipelining' from source: unknown 24160 1726853540.07786: variable 'ansible_timeout' from source: unknown 24160 1726853540.07788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.07903: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853540.07911: variable 'omit' from source: 
magic vars 24160 1726853540.07916: starting attempt loop 24160 1726853540.07919: running the handler 24160 1726853540.08012: variable 'ipv6_method' from source: set_fact 24160 1726853540.08020: Evaluated conditional ('disabled' in ipv6_method.stdout): True 24160 1726853540.08025: handler run complete 24160 1726853540.08035: attempt loop complete, returning result 24160 1726853540.08038: _execute() done 24160 1726853540.08041: dumping result to json 24160 1726853540.08043: done dumping result, returning 24160 1726853540.08049: done running TaskExecutor() for managed_node1/TASK: Assert that ipv6.method disabled is configured correctly [02083763-bbaf-5676-4eb4-00000000005e] 24160 1726853540.08052: sending task result for task 02083763-bbaf-5676-4eb4-00000000005e 24160 1726853540.08139: done sending task result for task 02083763-bbaf-5676-4eb4-00000000005e 24160 1726853540.08142: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24160 1726853540.08224: no more pending results, returning what we have 24160 1726853540.08227: results queue empty 24160 1726853540.08228: checking for any_errors_fatal 24160 1726853540.08236: done checking for any_errors_fatal 24160 1726853540.08236: checking for max_fail_percentage 24160 1726853540.08238: done checking for max_fail_percentage 24160 1726853540.08239: checking to see if all hosts have failed and the running result is not ok 24160 1726853540.08239: done checking to see if all hosts have failed 24160 1726853540.08240: getting the remaining hosts for this loop 24160 1726853540.08241: done getting the remaining hosts for this loop 24160 1726853540.08244: getting the next task for host managed_node1 24160 1726853540.08249: done getting next task for host managed_node1 24160 1726853540.08260: ^ task is: TASK: Set the connection_failed flag 24160 1726853540.08262: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853540.08265: getting variables 24160 1726853540.08267: in VariableManager get_vars() 24160 1726853540.08297: Calling all_inventory to load vars for managed_node1 24160 1726853540.08299: Calling groups_inventory to load vars for managed_node1 24160 1726853540.08301: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.08310: Calling all_plugins_play to load vars for managed_node1 24160 1726853540.08312: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.08315: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.09403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.10449: done with get_vars() 24160 1726853540.10464: done getting variables 24160 1726853540.10508: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set the connection_failed flag] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:64 Friday 20 September 2024 13:32:20 -0400 (0:00:00.042) 0:00:16.508 ****** 24160 1726853540.10531: entering _queue_task() for managed_node1/set_fact 24160 1726853540.10758: worker is 1 (out of 1 available) 24160 1726853540.10777: exiting _queue_task() for managed_node1/set_fact 24160 1726853540.10789: done queuing things up, now waiting for results queue to drain 24160 1726853540.10791: waiting for pending results... 
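The two tasks traced above work as a pair: the `command` task runs `nmcli connection show ethtest0 | grep ipv6.method` under `set -euxo pipefail` (so a connection with no `ipv6.method` line fails the task instead of returning empty stdout), and the `assert` task then evaluates the Jinja2 condition `'disabled' in ipv6_method.stdout`. A minimal shell sketch of that flow, where the hypothetical `show_connection` function stands in for `nmcli connection show ethtest0` (NetworkManager is not assumed to be available here):

```shell
#!/usr/bin/env bash
# Sketch of the verify + assert pair from the log above (assumptions: bash,
# and show_connection as a stub for `nmcli connection show ethtest0`).
set -euo pipefail

show_connection() {
  # Real nmcli output has many more keys; only the relevant ones are stubbed.
  printf 'connection.id:  ethtest0\nipv6.method:    disabled\n'
}

# Verify task: with pipefail, a grep that matches nothing fails the whole
# pipeline, so a missing ipv6.method line surfaces as a task failure.
ipv6_method=$(show_connection | grep ipv6.method)
echo "$ipv6_method"

# Assert task: mirrors the condition "'disabled' in ipv6_method.stdout",
# i.e. a plain substring check on the captured stdout.
case "$ipv6_method" in
  *disabled*) echo "All assertions passed" ;;
  *)          echo "assertion failed: $ipv6_method" >&2; exit 1 ;;
esac
```

The `set -euxo pipefail` prologue in the real task is what makes the `rc: 0` in the log meaningful: any stage of the pipeline failing would have failed the task.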
24160 1726853540.11056: running TaskExecutor() for managed_node1/TASK: Set the connection_failed flag 24160 1726853540.11113: in run() - task 02083763-bbaf-5676-4eb4-00000000005f 24160 1726853540.11131: variable 'ansible_search_path' from source: unknown 24160 1726853540.11156: calling self._execute() 24160 1726853540.11269: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.11275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.11293: variable 'omit' from source: magic vars 24160 1726853540.11558: variable 'ansible_distribution_major_version' from source: facts 24160 1726853540.11570: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853540.11648: variable '__network_connections_result' from source: set_fact 24160 1726853540.11664: Evaluated conditional (__network_connections_result.failed): False 24160 1726853540.11667: when evaluation is False, skipping this task 24160 1726853540.11672: _execute() done 24160 1726853540.11675: dumping result to json 24160 1726853540.11677: done dumping result, returning 24160 1726853540.11682: done running TaskExecutor() for managed_node1/TASK: Set the connection_failed flag [02083763-bbaf-5676-4eb4-00000000005f] 24160 1726853540.11687: sending task result for task 02083763-bbaf-5676-4eb4-00000000005f 24160 1726853540.11762: done sending task result for task 02083763-bbaf-5676-4eb4-00000000005f 24160 1726853540.11764: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 24160 1726853540.11842: no more pending results, returning what we have 24160 1726853540.11845: results queue empty 24160 1726853540.11846: checking for any_errors_fatal 24160 1726853540.11852: done checking for any_errors_fatal 24160 1726853540.11852: checking for max_fail_percentage 24160 1726853540.11855: done checking for 
max_fail_percentage 24160 1726853540.11856: checking to see if all hosts have failed and the running result is not ok 24160 1726853540.11857: done checking to see if all hosts have failed 24160 1726853540.11857: getting the remaining hosts for this loop 24160 1726853540.11859: done getting the remaining hosts for this loop 24160 1726853540.11862: getting the next task for host managed_node1 24160 1726853540.11872: done getting next task for host managed_node1 24160 1726853540.11874: ^ task is: TASK: meta (flush_handlers) 24160 1726853540.11879: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853540.11883: getting variables 24160 1726853540.11884: in VariableManager get_vars() 24160 1726853540.11915: Calling all_inventory to load vars for managed_node1 24160 1726853540.11917: Calling groups_inventory to load vars for managed_node1 24160 1726853540.11919: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.11928: Calling all_plugins_play to load vars for managed_node1 24160 1726853540.11931: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.11933: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.18538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.20026: done with get_vars() 24160 1726853540.20061: done getting variables 24160 1726853540.20116: in VariableManager get_vars() 24160 1726853540.20128: Calling all_inventory to load vars for managed_node1 24160 1726853540.20131: Calling groups_inventory to load vars for managed_node1 24160 1726853540.20133: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.20138: Calling 
all_plugins_play to load vars for managed_node1 24160 1726853540.20140: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.20143: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.21896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.23416: done with get_vars() 24160 1726853540.23443: done queuing things up, now waiting for results queue to drain 24160 1726853540.23446: results queue empty 24160 1726853540.23447: checking for any_errors_fatal 24160 1726853540.23449: done checking for any_errors_fatal 24160 1726853540.23450: checking for max_fail_percentage 24160 1726853540.23451: done checking for max_fail_percentage 24160 1726853540.23452: checking to see if all hosts have failed and the running result is not ok 24160 1726853540.23453: done checking to see if all hosts have failed 24160 1726853540.23454: getting the remaining hosts for this loop 24160 1726853540.23454: done getting the remaining hosts for this loop 24160 1726853540.23457: getting the next task for host managed_node1 24160 1726853540.23461: done getting next task for host managed_node1 24160 1726853540.23463: ^ task is: TASK: meta (flush_handlers) 24160 1726853540.23464: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853540.23466: getting variables 24160 1726853540.23467: in VariableManager get_vars() 24160 1726853540.23487: Calling all_inventory to load vars for managed_node1 24160 1726853540.23490: Calling groups_inventory to load vars for managed_node1 24160 1726853540.23492: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.23498: Calling all_plugins_play to load vars for managed_node1 24160 1726853540.23500: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.23503: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.24938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.26808: done with get_vars() 24160 1726853540.26829: done getting variables 24160 1726853540.26893: in VariableManager get_vars() 24160 1726853540.26905: Calling all_inventory to load vars for managed_node1 24160 1726853540.26907: Calling groups_inventory to load vars for managed_node1 24160 1726853540.26909: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.26914: Calling all_plugins_play to load vars for managed_node1 24160 1726853540.26916: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.26919: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.28232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.30125: done with get_vars() 24160 1726853540.30149: done queuing things up, now waiting for results queue to drain 24160 1726853540.30151: results queue empty 24160 1726853540.30152: checking for any_errors_fatal 24160 1726853540.30156: done checking for any_errors_fatal 24160 1726853540.30157: checking for max_fail_percentage 24160 1726853540.30158: done checking for max_fail_percentage 24160 1726853540.30159: checking to see if all hosts have failed and the running result is not 
ok 24160 1726853540.30159: done checking to see if all hosts have failed 24160 1726853540.30160: getting the remaining hosts for this loop 24160 1726853540.30161: done getting the remaining hosts for this loop 24160 1726853540.30164: getting the next task for host managed_node1 24160 1726853540.30174: done getting next task for host managed_node1 24160 1726853540.30175: ^ task is: None 24160 1726853540.30177: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853540.30178: done queuing things up, now waiting for results queue to drain 24160 1726853540.30178: results queue empty 24160 1726853540.30179: checking for any_errors_fatal 24160 1726853540.30180: done checking for any_errors_fatal 24160 1726853540.30180: checking for max_fail_percentage 24160 1726853540.30181: done checking for max_fail_percentage 24160 1726853540.30182: checking to see if all hosts have failed and the running result is not ok 24160 1726853540.30182: done checking to see if all hosts have failed 24160 1726853540.30184: getting the next task for host managed_node1 24160 1726853540.30186: done getting next task for host managed_node1 24160 1726853540.30187: ^ task is: None 24160 1726853540.30188: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853540.30342: in VariableManager get_vars() 24160 1726853540.30362: done with get_vars() 24160 1726853540.30369: in VariableManager get_vars() 24160 1726853540.30384: done with get_vars() 24160 1726853540.30388: variable 'omit' from source: magic vars 24160 1726853540.30488: variable 'profile' from source: play vars 24160 1726853540.30903: in VariableManager get_vars() 24160 1726853540.30917: done with get_vars() 24160 1726853540.30937: variable 'omit' from source: magic vars 24160 1726853540.31120: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 24160 1726853540.32473: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24160 1726853540.32644: getting the remaining hosts for this loop 24160 1726853540.32646: done getting the remaining hosts for this loop 24160 1726853540.32649: getting the next task for host managed_node1 24160 1726853540.32726: done getting next task for host managed_node1 24160 1726853540.32729: ^ task is: TASK: Gathering Facts 24160 1726853540.32731: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853540.32733: getting variables 24160 1726853540.32734: in VariableManager get_vars() 24160 1726853540.32865: Calling all_inventory to load vars for managed_node1 24160 1726853540.32867: Calling groups_inventory to load vars for managed_node1 24160 1726853540.32869: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853540.32877: Calling all_plugins_play to load vars for managed_node1 24160 1726853540.32879: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853540.32968: Calling groups_plugins_play to load vars for managed_node1 24160 1726853540.35451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853540.38932: done with get_vars() 24160 1726853540.38961: done getting variables 24160 1726853540.39127: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:32:20 -0400 (0:00:00.286) 0:00:16.794 ****** 24160 1726853540.39153: entering _queue_task() for managed_node1/gather_facts 24160 1726853540.39890: worker is 1 (out of 1 available) 24160 1726853540.39901: exiting _queue_task() for managed_node1/gather_facts 24160 1726853540.39912: done queuing things up, now waiting for results queue to drain 24160 1726853540.40029: waiting for pending results... 
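The "Set the connection_failed flag" task traced above was skipped because its `when:` condition, `__network_connections_result.failed`, evaluated to False (the role run had succeeded, so there was no failure to flag). A rough shell analogue of that skip logic — the variable name mirrors the log; everything else is illustrative:

```shell
#!/usr/bin/env bash
# Rough analogue of the conditional skip seen in the log: the set_fact task
# runs only when the previous role run reported failed=true.
network_connections_failed=false   # mirrors __network_connections_result.failed

if [ "$network_connections_failed" = true ]; then
  echo "connection_failed=true"                    # the set_fact branch
else
  echo "skipping: Conditional result was False"    # matches the skip_reason above
fi
```

This is why the log shows `skipping: [managed_node1]` with `"false_condition": "__network_connections_result.failed"` rather than a changed result.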
24160 1726853540.40405: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24160 1726853540.40525: in run() - task 02083763-bbaf-5676-4eb4-000000000454 24160 1726853540.40547: variable 'ansible_search_path' from source: unknown 24160 1726853540.40650: calling self._execute() 24160 1726853540.40865: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.40879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.40893: variable 'omit' from source: magic vars 24160 1726853540.41695: variable 'ansible_distribution_major_version' from source: facts 24160 1726853540.41976: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853540.41980: variable 'omit' from source: magic vars 24160 1726853540.41982: variable 'omit' from source: magic vars 24160 1726853540.41984: variable 'omit' from source: magic vars 24160 1726853540.41986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853540.42238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853540.42242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853540.42245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853540.42247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853540.42249: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853540.42251: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.42253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.42428: Set connection var ansible_shell_executable to /bin/sh 24160 1726853540.42776: Set 
connection var ansible_pipelining to False 24160 1726853540.42779: Set connection var ansible_connection to ssh 24160 1726853540.42782: Set connection var ansible_shell_type to sh 24160 1726853540.42784: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853540.42786: Set connection var ansible_timeout to 10 24160 1726853540.42788: variable 'ansible_shell_executable' from source: unknown 24160 1726853540.42790: variable 'ansible_connection' from source: unknown 24160 1726853540.42792: variable 'ansible_module_compression' from source: unknown 24160 1726853540.42794: variable 'ansible_shell_type' from source: unknown 24160 1726853540.42796: variable 'ansible_shell_executable' from source: unknown 24160 1726853540.42797: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853540.42799: variable 'ansible_pipelining' from source: unknown 24160 1726853540.42801: variable 'ansible_timeout' from source: unknown 24160 1726853540.42803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853540.43070: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853540.43088: variable 'omit' from source: magic vars 24160 1726853540.43100: starting attempt loop 24160 1726853540.43107: running the handler 24160 1726853540.43127: variable 'ansible_facts' from source: unknown 24160 1726853540.43157: _low_level_execute_command(): starting 24160 1726853540.43170: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853540.44908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853540.44912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853540.45005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853540.46712: stdout chunk (state=3): >>>/root <<< 24160 1726853540.46847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853540.46865: stdout chunk (state=3): >>><<< 24160 1726853540.46882: stderr chunk (state=3): >>><<< 24160 1726853540.46911: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853540.46933: _low_level_execute_command(): starting 24160 1726853540.46944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971 `" && echo ansible-tmp-1726853540.4691887-24976-60588071047971="` echo /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971 `" ) && sleep 0' 24160 1726853540.47591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853540.47676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853540.47688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853540.47691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853540.47878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853540.49712: stdout chunk (state=3): >>>ansible-tmp-1726853540.4691887-24976-60588071047971=/root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971 <<< 24160 1726853540.49820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853540.49849: stderr chunk (state=3): >>><<< 24160 1726853540.49851: stdout chunk (state=3): >>><<< 24160 1726853540.49867: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853540.4691887-24976-60588071047971=/root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853540.49914: variable 'ansible_module_compression' from source: unknown 24160 1726853540.49947: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24160 1726853540.50004: variable 'ansible_facts' from source: unknown 24160 1726853540.50137: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py 24160 1726853540.50296: Sending initial data 24160 1726853540.50299: Sent initial data (153 bytes) 24160 1726853540.50915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853540.50943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 
1726853540.51015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853540.52539: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853540.52592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853540.52647: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpk8cyppux /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py <<< 24160 1726853540.52649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py" <<< 24160 1726853540.52683: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpk8cyppux" to remote "/root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py" <<< 24160 1726853540.53747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853540.53778: stderr chunk (state=3): >>><<< 24160 1726853540.53782: stdout chunk (state=3): >>><<< 24160 1726853540.53798: done transferring module to remote 24160 1726853540.53807: _low_level_execute_command(): starting 24160 1726853540.53813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/ /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py && sleep 0' 24160 1726853540.54248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853540.54252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853540.54256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 24160 1726853540.54259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853540.54261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853540.54311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853540.54316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853540.54355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853540.56102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853540.56138: stderr chunk (state=3): >>><<< 24160 1726853540.56146: stdout chunk (state=3): >>><<< 24160 1726853540.56170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853540.56176: _low_level_execute_command(): starting 24160 1726853540.56190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/AnsiballZ_setup.py && sleep 0' 24160 1726853540.56670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853540.56679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853540.56681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853540.56683: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853540.56685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853540.56732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853540.56739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853540.56785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853541.21697: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "20", "epoch": "1726853540", "epoch_int": "1726853540", "date": "2024-09-20", "time": "13:32:20", "iso8601_micro": "2024-09-20T17:32:20.832360Z", "iso8601": "2024-09-20T17:32:20Z", "iso8601_basic": "20240920T133220832360", "iso8601_basic_short": "20240920T133220", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_loadavg": {"1m": 0.44482421875, "5m": 0.35400390625, "15m": 0.1923828125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUG<<< 24160 1726853541.21708: stdout chunk (state=3): >>>INFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3288, 
"used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 707, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 
261794648064, "block_size": 4096, "block_total": 65519099, "block_available": 63914709, "block_used": 1604390, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo", "ethtest0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_ud<<< 24160 1726853541.21750: stdout chunk (state=3): >>>p_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "46:cb:a9:60:52:30", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44cb:a9ff:fe60:5230", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off 
[fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ba:78:39:74:8a:f6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b878:39ff:fe74:8af6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24160 1726853541.23681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853541.23692: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853541.23720: stdout chunk (state=3): >>><<< 24160 1726853541.23723: stderr chunk (state=3): >>><<< 24160 1726853541.23776: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "20", "epoch": "1726853540", "epoch_int": "1726853540", "date": "2024-09-20", "time": "13:32:20", "iso8601_micro": "2024-09-20T17:32:20.832360Z", "iso8601": "2024-09-20T17:32:20Z", "iso8601_basic": "20240920T133220832360", "iso8601_basic_short": "20240920T133220", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_loadavg": {"1m": 0.44482421875, "5m": 0.35400390625, "15m": 0.1923828125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 
44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 707, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794648064, "block_size": 4096, "block_total": 65519099, "block_available": 63914709, "block_used": 1604390, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo", "ethtest0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": 
"on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "46:cb:a9:60:52:30", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44cb:a9ff:fe60:5230", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ba:78:39:74:8a:f6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": 
false, "ipv6": [{"address": "fe80::b878:39ff:fe74:8af6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853541.24309: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853541.24377: _low_level_execute_command(): starting 24160 1726853541.24381: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853540.4691887-24976-60588071047971/ > /dev/null 2>&1 && sleep 0' 24160 1726853541.25014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853541.25028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853541.25088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853541.25159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853541.25198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853541.25214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853541.25295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853541.27292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853541.27296: stdout chunk (state=3): >>><<< 24160 1726853541.27298: stderr chunk (state=3): >>><<< 24160 1726853541.27300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853541.27303: handler run complete 24160 1726853541.27467: variable 'ansible_facts' from source: unknown 24160 1726853541.27590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853541.27992: variable 'ansible_facts' from source: unknown 24160 1726853541.28105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853541.28378: attempt loop complete, returning result 24160 1726853541.28383: _execute() done 24160 1726853541.28385: dumping result to json 24160 1726853541.28387: done dumping result, returning 24160 1726853541.28390: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-5676-4eb4-000000000454] 24160 1726853541.28392: sending task result for task 02083763-bbaf-5676-4eb4-000000000454 24160 1726853541.29150: done sending task result for task 02083763-bbaf-5676-4eb4-000000000454 24160 1726853541.29156: WORKER PROCESS EXITING ok: [managed_node1] 24160 1726853541.29785: no more pending results, returning what we have 24160 1726853541.29789: results queue empty 24160 1726853541.29790: checking for any_errors_fatal 24160 1726853541.29791: done checking for any_errors_fatal 24160 1726853541.29792: checking for max_fail_percentage 24160 1726853541.29794: done checking for max_fail_percentage 24160 1726853541.29794: 
checking to see if all hosts have failed and the running result is not ok 24160 1726853541.29795: done checking to see if all hosts have failed 24160 1726853541.29796: getting the remaining hosts for this loop 24160 1726853541.29797: done getting the remaining hosts for this loop 24160 1726853541.29805: getting the next task for host managed_node1 24160 1726853541.29810: done getting next task for host managed_node1 24160 1726853541.29812: ^ task is: TASK: meta (flush_handlers) 24160 1726853541.29814: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853541.29818: getting variables 24160 1726853541.29819: in VariableManager get_vars() 24160 1726853541.29846: Calling all_inventory to load vars for managed_node1 24160 1726853541.29849: Calling groups_inventory to load vars for managed_node1 24160 1726853541.29851: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853541.29926: Calling all_plugins_play to load vars for managed_node1 24160 1726853541.29930: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853541.29933: Calling groups_plugins_play to load vars for managed_node1 24160 1726853541.31575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853541.34737: done with get_vars() 24160 1726853541.34776: done getting variables 24160 1726853541.34852: in VariableManager get_vars() 24160 1726853541.34869: Calling all_inventory to load vars for managed_node1 24160 1726853541.34878: Calling groups_inventory to load vars for managed_node1 24160 1726853541.34880: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853541.34886: Calling all_plugins_play to load vars for managed_node1 24160 
1726853541.34896: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853541.34900: Calling groups_plugins_play to load vars for managed_node1 24160 1726853541.36322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853541.38207: done with get_vars() 24160 1726853541.38234: done queuing things up, now waiting for results queue to drain 24160 1726853541.38236: results queue empty 24160 1726853541.38237: checking for any_errors_fatal 24160 1726853541.38245: done checking for any_errors_fatal 24160 1726853541.38246: checking for max_fail_percentage 24160 1726853541.38247: done checking for max_fail_percentage 24160 1726853541.38252: checking to see if all hosts have failed and the running result is not ok 24160 1726853541.38253: done checking to see if all hosts have failed 24160 1726853541.38260: getting the remaining hosts for this loop 24160 1726853541.38261: done getting the remaining hosts for this loop 24160 1726853541.38264: getting the next task for host managed_node1 24160 1726853541.38268: done getting next task for host managed_node1 24160 1726853541.38273: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24160 1726853541.38274: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
24160 1726853541.38284: getting variables
24160 1726853541.38285: in VariableManager get_vars()
24160 1726853541.38300: Calling all_inventory to load vars for managed_node1
24160 1726853541.38302: Calling groups_inventory to load vars for managed_node1
24160 1726853541.38304: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853541.38309: Calling all_plugins_play to load vars for managed_node1
24160 1726853541.38311: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853541.38313: Calling groups_plugins_play to load vars for managed_node1
24160 1726853541.39533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853541.41279: done with get_vars()
24160 1726853541.41305: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:32:21 -0400 (0:00:01.022) 0:00:17.816 ******
24160 1726853541.41396: entering _queue_task() for managed_node1/include_tasks
24160 1726853541.42294: worker is 1 (out of 1 available)
24160 1726853541.42306: exiting _queue_task() for managed_node1/include_tasks
24160 1726853541.42318: done queuing things up, now waiting for results queue to drain
24160 1726853541.42319: waiting for pending results...
24160 1726853541.43307: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
24160 1726853541.43312: in run() - task 02083763-bbaf-5676-4eb4-000000000067
24160 1726853541.43315: variable 'ansible_search_path' from source: unknown
24160 1726853541.43317: variable 'ansible_search_path' from source: unknown
24160 1726853541.43320: calling self._execute()
24160 1726853541.43323: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.43326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.43328: variable 'omit' from source: magic vars
24160 1726853541.44095: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.44099: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.44312: variable 'connection_failed' from source: set_fact
24160 1726853541.44316: Evaluated conditional (not connection_failed): True
24160 1726853541.44518: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.44535: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.44757: variable 'connection_failed' from source: set_fact
24160 1726853541.44768: Evaluated conditional (not connection_failed): True
24160 1726853541.44781: _execute() done
24160 1726853541.44817: dumping result to json
24160 1726853541.44825: done dumping result, returning
24160 1726853541.44837: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5676-4eb4-000000000067]
24160 1726853541.44853: sending task result for task 02083763-bbaf-5676-4eb4-000000000067
24160 1726853541.45233: no more pending results, returning what we have
24160 1726853541.45238: in VariableManager get_vars()
24160 1726853541.45287: Calling all_inventory to load vars for managed_node1
24160 1726853541.45291: Calling groups_inventory to load vars for managed_node1
24160 1726853541.45293: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853541.45306: Calling all_plugins_play to load vars for managed_node1
24160 1726853541.45308: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853541.45311: Calling groups_plugins_play to load vars for managed_node1
24160 1726853541.46107: done sending task result for task 02083763-bbaf-5676-4eb4-000000000067
24160 1726853541.46111: WORKER PROCESS EXITING
24160 1726853541.48616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853541.51383: done with get_vars()
24160 1726853541.51418: variable 'ansible_search_path' from source: unknown
24160 1726853541.51420: variable 'ansible_search_path' from source: unknown
24160 1726853541.51468: we have included files to process
24160 1726853541.51470: generating all_blocks data
24160 1726853541.51472: done generating all_blocks data
24160 1726853541.51474: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
24160 1726853541.51475: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
24160 1726853541.51479: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
24160 1726853541.52179: done processing included file
24160 1726853541.52181: iterating over new_blocks loaded from include file
24160 1726853541.52182: in VariableManager get_vars()
24160 1726853541.52204: done with get_vars()
24160 1726853541.52205: filtering new block on tags
24160 1726853541.52234: done filtering new block on tags
24160 1726853541.52237: in VariableManager get_vars()
24160 1726853541.52261: done with get_vars()
24160 1726853541.52263: filtering new block on tags
24160 1726853541.52285: done filtering new block on tags
24160 1726853541.52288: in VariableManager get_vars()
24160 1726853541.52308: done with get_vars()
24160 1726853541.52310: filtering new block on tags
24160 1726853541.52336: done filtering new block on tags
24160 1726853541.52338: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
24160 1726853541.52344: extending task lists for all hosts with included blocks
24160 1726853541.52810: done extending task lists
24160 1726853541.52811: done processing included files
24160 1726853541.52812: results queue empty
24160 1726853541.52813: checking for any_errors_fatal
24160 1726853541.52814: done checking for any_errors_fatal
24160 1726853541.52815: checking for max_fail_percentage
24160 1726853541.52816: done checking for max_fail_percentage
24160 1726853541.52817: checking to see if all hosts have failed and the running result is not ok
24160 1726853541.52818: done checking to see if all hosts have failed
24160 1726853541.52818: getting the remaining hosts for this loop
24160 1726853541.52819: done getting the remaining hosts for this loop
24160 1726853541.52822: getting the next task for host managed_node1
24160 1726853541.52826: done getting next task for host managed_node1
24160 1726853541.52829: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
24160 1726853541.52831: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853541.52841: getting variables
24160 1726853541.52842: in VariableManager get_vars()
24160 1726853541.52864: Calling all_inventory to load vars for managed_node1
24160 1726853541.52867: Calling groups_inventory to load vars for managed_node1
24160 1726853541.52869: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853541.52879: Calling all_plugins_play to load vars for managed_node1
24160 1726853541.52881: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853541.52884: Calling groups_plugins_play to load vars for managed_node1
24160 1726853541.54363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853541.56836: done with get_vars()
24160 1726853541.56861: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 13:32:21 -0400 (0:00:00.155) 0:00:17.972 ******
24160 1726853541.56941: entering _queue_task() for managed_node1/setup
24160 1726853541.57374: worker is 1 (out of 1 available)
24160 1726853541.57388: exiting _queue_task() for managed_node1/setup
24160 1726853541.57400: done queuing things up, now waiting for results queue to drain
24160 1726853541.57402: waiting for pending results...
24160 1726853541.57765: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
24160 1726853541.57938: in run() - task 02083763-bbaf-5676-4eb4-000000000495
24160 1726853541.57962: variable 'ansible_search_path' from source: unknown
24160 1726853541.57973: variable 'ansible_search_path' from source: unknown
24160 1726853541.58026: calling self._execute()
24160 1726853541.58141: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.58156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.58170: variable 'omit' from source: magic vars
24160 1726853541.58766: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.58769: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.58829: variable 'connection_failed' from source: set_fact
24160 1726853541.58923: Evaluated conditional (not connection_failed): True
24160 1726853541.59123: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.59134: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.59252: variable 'connection_failed' from source: set_fact
24160 1726853541.59288: Evaluated conditional (not connection_failed): True
24160 1726853541.59452: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.59467: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.59592: variable 'connection_failed' from source: set_fact
24160 1726853541.59603: Evaluated conditional (not connection_failed): True
24160 1726853541.59723: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.59741: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.59851: variable 'connection_failed' from source: set_fact
24160 1726853541.59873: Evaluated conditional (not connection_failed): True
24160 1726853541.60118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24160 1726853541.62697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24160 1726853541.62706: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24160 1726853541.62748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24160 1726853541.62810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24160 1726853541.62841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24160 1726853541.62999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853541.63047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853541.63160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853541.63211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853541.63257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853541.63331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24160 1726853541.63604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853541.63608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853541.63610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853541.63613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853541.64004: variable '__network_required_facts' from source: role '' defaults
24160 1726853541.64016: variable 'ansible_facts' from source: unknown
24160 1726853541.65111: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
24160 1726853541.65125: when evaluation is False, skipping this task
24160 1726853541.65137: _execute() done
24160 1726853541.65146: dumping result to json
24160 1726853541.65156: done dumping result, returning
24160 1726853541.65196: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5676-4eb4-000000000495]
24160 1726853541.65199: sending task result for task 02083763-bbaf-5676-4eb4-000000000495
24160 1726853541.65587: done sending task result for task 02083763-bbaf-5676-4eb4-000000000495
24160 1726853541.65590: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
24160 1726853541.65630: no more pending results, returning what we have
24160 1726853541.65634: results queue empty
24160 1726853541.65635: checking for any_errors_fatal
24160 1726853541.65636: done checking for any_errors_fatal
24160 1726853541.65637: checking for max_fail_percentage
24160 1726853541.65639: done checking for max_fail_percentage
24160 1726853541.65639: checking to see if all hosts have failed and the running result is not ok
24160 1726853541.65640: done checking to see if all hosts have failed
24160 1726853541.65641: getting the remaining hosts for this loop
24160 1726853541.65642: done getting the remaining hosts for this loop
24160 1726853541.65645: getting the next task for host managed_node1
24160 1726853541.65653: done getting next task for host managed_node1
24160 1726853541.65659: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
24160 1726853541.65663: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853541.65678: getting variables
24160 1726853541.65680: in VariableManager get_vars()
24160 1726853541.65716: Calling all_inventory to load vars for managed_node1
24160 1726853541.65719: Calling groups_inventory to load vars for managed_node1
24160 1726853541.65727: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853541.65737: Calling all_plugins_play to load vars for managed_node1
24160 1726853541.65740: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853541.65743: Calling groups_plugins_play to load vars for managed_node1
24160 1726853541.68620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853541.71725: done with get_vars()
24160 1726853541.71752: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 13:32:21 -0400 (0:00:00.149) 0:00:18.121 ******
24160 1726853541.71861: entering _queue_task() for managed_node1/stat
24160 1726853541.72212: worker is 1 (out of 1 available)
24160 1726853541.72342: exiting _queue_task() for managed_node1/stat
24160 1726853541.72357: done queuing things up, now waiting for results queue to drain
24160 1726853541.72359: waiting for pending results...
24160 1726853541.72549: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
24160 1726853541.72704: in run() - task 02083763-bbaf-5676-4eb4-000000000497
24160 1726853541.72726: variable 'ansible_search_path' from source: unknown
24160 1726853541.72735: variable 'ansible_search_path' from source: unknown
24160 1726853541.72802: calling self._execute()
24160 1726853541.72896: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.72998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.73003: variable 'omit' from source: magic vars
24160 1726853541.73619: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.73779: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.74203: variable 'connection_failed' from source: set_fact
24160 1726853541.74207: Evaluated conditional (not connection_failed): True
24160 1726853541.74365: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.74446: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.74693: variable 'connection_failed' from source: set_fact
24160 1726853541.74703: Evaluated conditional (not connection_failed): True
24160 1726853541.74945: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.75011: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.75292: variable 'connection_failed' from source: set_fact
24160 1726853541.75305: Evaluated conditional (not connection_failed): True
24160 1726853541.75678: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.75682: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.75719: variable 'connection_failed' from source: set_fact
24160 1726853541.75730: Evaluated conditional (not connection_failed): True
24160 1726853541.75883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24160 1726853541.76155: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24160 1726853541.76204: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24160 1726853541.76243: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24160 1726853541.76476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24160 1726853541.76480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24160 1726853541.76482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24160 1726853541.76485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853541.76492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24160 1726853541.76579: variable '__network_is_ostree' from source: set_fact
24160 1726853541.76591: Evaluated conditional (not __network_is_ostree is defined): False
24160 1726853541.76598: when evaluation is False, skipping this task
24160 1726853541.76606: _execute() done
24160 1726853541.76613: dumping result to json
24160 1726853541.76624: done dumping result, returning
24160 1726853541.76637: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5676-4eb4-000000000497]
24160 1726853541.76646: sending task result for task 02083763-bbaf-5676-4eb4-000000000497
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
24160 1726853541.76808: no more pending results, returning what we have
24160 1726853541.76812: results queue empty
24160 1726853541.76813: checking for any_errors_fatal
24160 1726853541.76820: done checking for any_errors_fatal
24160 1726853541.76821: checking for max_fail_percentage
24160 1726853541.76822: done checking for max_fail_percentage
24160 1726853541.76823: checking to see if all hosts have failed and the running result is not ok
24160 1726853541.76830: done checking to see if all hosts have failed
24160 1726853541.76831: getting the remaining hosts for this loop
24160 1726853541.76832: done getting the remaining hosts for this loop
24160 1726853541.76876: getting the next task for host managed_node1
24160 1726853541.76883: done getting next task for host managed_node1
24160 1726853541.76887: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
24160 1726853541.76891: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853541.76908: done sending task result for task 02083763-bbaf-5676-4eb4-000000000497
24160 1726853541.76912: WORKER PROCESS EXITING
24160 1726853541.76919: getting variables
24160 1726853541.76920: in VariableManager get_vars()
24160 1726853541.76998: Calling all_inventory to load vars for managed_node1
24160 1726853541.77001: Calling groups_inventory to load vars for managed_node1
24160 1726853541.77003: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853541.77010: Calling all_plugins_play to load vars for managed_node1
24160 1726853541.77013: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853541.77015: Calling groups_plugins_play to load vars for managed_node1
24160 1726853541.78312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853541.79925: done with get_vars()
24160 1726853541.79944: done getting variables
24160 1726853541.80006: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 13:32:21 -0400 (0:00:00.081) 0:00:18.203 ******
24160 1726853541.80041: entering _queue_task() for managed_node1/set_fact
24160 1726853541.80364: worker is 1 (out of 1 available)
24160 1726853541.80378: exiting _queue_task() for managed_node1/set_fact
24160 1726853541.80390: done queuing things up, now waiting for results queue to drain
24160 1726853541.80392: waiting for pending results...
24160 1726853541.80667: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
24160 1726853541.80801: in run() - task 02083763-bbaf-5676-4eb4-000000000498
24160 1726853541.80822: variable 'ansible_search_path' from source: unknown
24160 1726853541.80829: variable 'ansible_search_path' from source: unknown
24160 1726853541.80867: calling self._execute()
24160 1726853541.80978: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.81005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.81009: variable 'omit' from source: magic vars
24160 1726853541.81396: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.81441: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.81532: variable 'connection_failed' from source: set_fact
24160 1726853541.81546: Evaluated conditional (not connection_failed): True
24160 1726853541.81663: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.81770: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.81775: variable 'connection_failed' from source: set_fact
24160 1726853541.81785: Evaluated conditional (not connection_failed): True
24160 1726853541.81896: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.81907: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.82005: variable 'connection_failed' from source: set_fact
24160 1726853541.82016: Evaluated conditional (not connection_failed): True
24160 1726853541.82123: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.82135: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.82237: variable 'connection_failed' from source: set_fact
24160 1726853541.82248: Evaluated conditional (not connection_failed): True
24160 1726853541.82409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24160 1726853541.82690: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24160 1726853541.82737: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24160 1726853541.82976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24160 1726853541.82979: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24160 1726853541.82983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24160 1726853541.82985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24160 1726853541.82998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853541.83030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24160 1726853541.83123: variable '__network_is_ostree' from source: set_fact
24160 1726853541.83136: Evaluated conditional (not __network_is_ostree is defined): False
24160 1726853541.83143: when evaluation is False, skipping this task
24160 1726853541.83151: _execute() done
24160 1726853541.83159: dumping result to json
24160 1726853541.83166: done dumping result, returning
24160 1726853541.83181: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5676-4eb4-000000000498]
24160 1726853541.83192: sending task result for task 02083763-bbaf-5676-4eb4-000000000498
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
24160 1726853541.83362: no more pending results, returning what we have
24160 1726853541.83366: results queue empty
24160 1726853541.83367: checking for any_errors_fatal
24160 1726853541.83377: done checking for any_errors_fatal
24160 1726853541.83378: checking for max_fail_percentage
24160 1726853541.83380: done checking for max_fail_percentage
24160 1726853541.83381: checking to see if all hosts have failed and the running result is not ok
24160 1726853541.83382: done checking to see if all hosts have failed
24160 1726853541.83382: getting the remaining hosts for this loop
24160 1726853541.83384: done getting the remaining hosts for this loop
24160 1726853541.83388: getting the next task for host managed_node1
24160 1726853541.83398: done getting next task for host managed_node1
24160 1726853541.83401: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
24160 1726853541.83405: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24160 1726853541.83420: getting variables
24160 1726853541.83422: in VariableManager get_vars()
24160 1726853541.83463: Calling all_inventory to load vars for managed_node1
24160 1726853541.83466: Calling groups_inventory to load vars for managed_node1
24160 1726853541.83469: Calling all_plugins_inventory to load vars for managed_node1
24160 1726853541.83682: Calling all_plugins_play to load vars for managed_node1
24160 1726853541.83686: Calling groups_plugins_inventory to load vars for managed_node1
24160 1726853541.83690: Calling groups_plugins_play to load vars for managed_node1
24160 1726853541.84209: done sending task result for task 02083763-bbaf-5676-4eb4-000000000498
24160 1726853541.84219: WORKER PROCESS EXITING
24160 1726853541.85394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24160 1726853541.86994: done with get_vars()
24160 1726853541.87023: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 13:32:21 -0400 (0:00:00.070) 0:00:18.273 ******
24160 1726853541.87127: entering _queue_task() for managed_node1/service_facts
24160 1726853541.87701: worker is 1 (out of 1 available)
24160 1726853541.87710: exiting _queue_task() for managed_node1/service_facts
24160 1726853541.87721: done queuing things up, now waiting for results queue to drain
24160 1726853541.87722: waiting for pending results...
24160 1726853541.87792: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running
24160 1726853541.87944: in run() - task 02083763-bbaf-5676-4eb4-00000000049a
24160 1726853541.87949: variable 'ansible_search_path' from source: unknown
24160 1726853541.87956: variable 'ansible_search_path' from source: unknown
24160 1726853541.87961: calling self._execute()
24160 1726853541.88080: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.88084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.88087: variable 'omit' from source: magic vars
24160 1726853541.88490: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.88494: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.88582: variable 'connection_failed' from source: set_fact
24160 1726853541.88587: Evaluated conditional (not connection_failed): True
24160 1726853541.88695: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.88816: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.88820: variable 'connection_failed' from source: set_fact
24160 1726853541.88823: Evaluated conditional (not connection_failed): True
24160 1726853541.88911: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.88922: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.89011: variable 'connection_failed' from source: set_fact
24160 1726853541.89015: Evaluated conditional (not connection_failed): True
24160 1726853541.89122: variable 'ansible_distribution_major_version' from source: facts
24160 1726853541.89125: Evaluated conditional (ansible_distribution_major_version != '6'): True
24160 1726853541.89219: variable 'connection_failed' from source: set_fact
24160 1726853541.89223: Evaluated conditional (not connection_failed): True
24160 1726853541.89231: variable 'omit' from source: magic vars
24160 1726853541.89291: variable 'omit' from source: magic vars
24160 1726853541.89323: variable 'omit' from source: magic vars
24160 1726853541.89361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24160 1726853541.89397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24160 1726853541.89415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24160 1726853541.89432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24160 1726853541.89442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24160 1726853541.89469: variable 'inventory_hostname' from source: host vars for 'managed_node1'
24160 1726853541.89475: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.89480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.89581: Set connection var ansible_shell_executable to /bin/sh
24160 1726853541.89586: Set connection var ansible_pipelining to False
24160 1726853541.89589: Set connection var ansible_connection to ssh
24160 1726853541.89591: Set connection var ansible_shell_type to sh
24160 1726853541.89598: Set connection var ansible_module_compression to ZIP_DEFLATED
24160 1726853541.89609: Set connection var ansible_timeout to 10
24160 1726853541.89629: variable 'ansible_shell_executable' from source: unknown
24160 1726853541.89632: variable 'ansible_connection' from source: unknown
24160 1726853541.89635: variable 'ansible_module_compression' from source: unknown
24160 1726853541.89638: variable 'ansible_shell_type' from source: unknown
24160 1726853541.89640: variable 'ansible_shell_executable' from source: unknown
24160 1726853541.89642: variable 'ansible_host' from source: host vars for 'managed_node1'
24160 1726853541.89689: variable 'ansible_pipelining' from source: unknown
24160 1726853541.89692: variable 'ansible_timeout' from source: unknown
24160 1726853541.89695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
24160 1726853541.89833: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
24160 1726853541.89847: variable 'omit' from source: magic vars
24160 1726853541.89855: starting attempt loop
24160 1726853541.89861: running the handler
24160 1726853541.89879: _low_level_execute_command(): starting
24160 1726853541.89892: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
24160 1726853541.90685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<<
24160 1726853541.90701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24160 1726853541.90796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24160 1726853541.92487: stdout chunk (state=3): >>>/root <<<
24160 1726853541.92643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24160 1726853541.92648: stdout chunk (state=3): >>><<<
24160 1726853541.92650: stderr chunk (state=3): >>><<<
24160 1726853541.92780: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
24160 1726853541.92784: _low_level_execute_command(): starting
24160 1726853541.92795: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745 `" && echo ansible-tmp-1726853541.9268253-25045-201015404898745="` echo /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745 `" ) && sleep 0' 24160 1726853541.93362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853541.93411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853541.93423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853541.93448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853541.93521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853541.95400: stdout chunk (state=3): >>>ansible-tmp-1726853541.9268253-25045-201015404898745=/root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745 <<< 24160 1726853541.95542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853541.95557: stderr chunk (state=3): 
>>><<< 24160 1726853541.95567: stdout chunk (state=3): >>><<< 24160 1726853541.95592: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853541.9268253-25045-201015404898745=/root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853541.95777: variable 'ansible_module_compression' from source: unknown 24160 1726853541.95780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24160 1726853541.95783: variable 'ansible_facts' from source: unknown 24160 1726853541.95832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py 24160 1726853541.96048: Sending initial data 24160 1726853541.96060: Sent initial data (162 
bytes) 24160 1726853541.96583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853541.96599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853541.96615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853541.96688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853541.96734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853541.96752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853541.96778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853541.96848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853541.98359: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853541.98415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853541.98481: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmprvlbekas /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py <<< 24160 1726853541.98502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py" <<< 24160 1726853541.98525: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmprvlbekas" to remote "/root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py" <<< 24160 1726853541.99343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853541.99400: stderr chunk (state=3): >>><<< 24160 1726853541.99409: stdout chunk (state=3): >>><<< 24160 1726853541.99649: done transferring module to remote 24160 1726853541.99732: _low_level_execute_command(): starting 24160 1726853541.99736: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/ /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py && sleep 0' 24160 1726853542.00233: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853542.00251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853542.00266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853542.00284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853542.00301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853542.00393: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853542.00406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853542.00424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853542.00494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853542.00552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853542.02339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853542.02347: stdout chunk (state=3): >>><<< 24160 1726853542.02365: stderr chunk (state=3): >>><<< 24160 1726853542.02387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853542.02394: _low_level_execute_command(): starting 24160 1726853542.02403: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/AnsiballZ_service_facts.py && sleep 0' 24160 1726853542.03041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853542.03059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853542.03074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853542.03093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853542.03144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853542.03214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853542.03257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853542.03278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853542.03364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853543.57307: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 24160 1726853543.57323: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": 
"lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24160 1726853543.59060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853543.59067: stdout chunk (state=3): >>><<< 24160 1726853543.59276: stderr chunk (state=3): >>><<< 24160 1726853543.59282: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853543.60195: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853543.60211: _low_level_execute_command(): starting 24160 1726853543.60222: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853541.9268253-25045-201015404898745/ > /dev/null 2>&1 && sleep 0' 24160 1726853543.61183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853543.61544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853543.61641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853543.61710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853543.63626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853543.63653: stdout chunk (state=3): >>><<< 24160 1726853543.63657: stderr chunk (state=3): >>><<< 24160 1726853543.63679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853543.63699: handler run complete 24160 1726853543.63909: variable 'ansible_facts' from source: unknown 24160 1726853543.64074: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853543.64586: variable 'ansible_facts' from source: unknown 24160 1726853543.64736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853543.65049: attempt loop complete, returning result 24160 1726853543.65387: _execute() done 24160 1726853543.65391: dumping result to json 24160 1726853543.65393: done dumping result, returning 24160 1726853543.65395: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5676-4eb4-00000000049a] 24160 1726853543.65397: sending task result for task 02083763-bbaf-5676-4eb4-00000000049a ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853543.68156: no more pending results, returning what we have 24160 1726853543.68159: results queue empty 24160 1726853543.68160: checking for any_errors_fatal 24160 1726853543.68163: done checking for any_errors_fatal 24160 1726853543.68164: checking for max_fail_percentage 24160 1726853543.68165: done checking for max_fail_percentage 24160 1726853543.68166: checking to see if all hosts have failed and the running result is not ok 24160 1726853543.68167: done checking to see if all hosts have failed 24160 1726853543.68168: getting the remaining hosts for this loop 24160 1726853543.68169: done getting the remaining hosts for this loop 24160 1726853543.68178: getting the next task for host managed_node1 24160 1726853543.68183: done getting next task for host managed_node1 24160 1726853543.68186: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24160 1726853543.68188: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853543.68198: getting variables 24160 1726853543.68199: in VariableManager get_vars() 24160 1726853543.68227: Calling all_inventory to load vars for managed_node1 24160 1726853543.68232: Calling groups_inventory to load vars for managed_node1 24160 1726853543.68234: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853543.68242: Calling all_plugins_play to load vars for managed_node1 24160 1726853543.68245: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853543.68248: Calling groups_plugins_play to load vars for managed_node1 24160 1726853543.68793: done sending task result for task 02083763-bbaf-5676-4eb4-00000000049a 24160 1726853543.68796: WORKER PROCESS EXITING 24160 1726853543.69949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853543.72445: done with get_vars() 24160 1726853543.72473: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:32:23 -0400 (0:00:01.854) 0:00:20.128 ****** 24160 1726853543.72609: entering _queue_task() for managed_node1/package_facts 24160 1726853543.73428: worker is 1 (out of 1 available) 24160 1726853543.73441: exiting _queue_task() for managed_node1/package_facts 24160 1726853543.73450: done queuing things up, now waiting for results queue to drain 24160 
1726853543.73452: waiting for pending results... 24160 1726853543.73596: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 24160 1726853543.73738: in run() - task 02083763-bbaf-5676-4eb4-00000000049b 24160 1726853543.73764: variable 'ansible_search_path' from source: unknown 24160 1726853543.73775: variable 'ansible_search_path' from source: unknown 24160 1726853543.73941: calling self._execute() 24160 1726853543.74187: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853543.74199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853543.74214: variable 'omit' from source: magic vars 24160 1726853543.75116: variable 'ansible_distribution_major_version' from source: facts 24160 1726853543.75135: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853543.75259: variable 'connection_failed' from source: set_fact 24160 1726853543.75273: Evaluated conditional (not connection_failed): True 24160 1726853543.75391: variable 'ansible_distribution_major_version' from source: facts 24160 1726853543.75425: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853543.75514: variable 'connection_failed' from source: set_fact 24160 1726853543.75531: Evaluated conditional (not connection_failed): True 24160 1726853543.75647: variable 'ansible_distribution_major_version' from source: facts 24160 1726853543.75661: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853543.75770: variable 'connection_failed' from source: set_fact 24160 1726853543.75784: Evaluated conditional (not connection_failed): True 24160 1726853543.75901: variable 'ansible_distribution_major_version' from source: facts 24160 1726853543.75912: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853543.76020: variable 'connection_failed' from source: set_fact 24160 
1726853543.76029: Evaluated conditional (not connection_failed): True 24160 1726853543.76079: variable 'omit' from source: magic vars 24160 1726853543.76106: variable 'omit' from source: magic vars 24160 1726853543.76143: variable 'omit' from source: magic vars 24160 1726853543.76192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853543.76227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853543.76251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853543.76275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853543.76297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853543.76375: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853543.76378: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853543.76380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853543.76440: Set connection var ansible_shell_executable to /bin/sh 24160 1726853543.76449: Set connection var ansible_pipelining to False 24160 1726853543.76459: Set connection var ansible_connection to ssh 24160 1726853543.76464: Set connection var ansible_shell_type to sh 24160 1726853543.76476: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853543.76488: Set connection var ansible_timeout to 10 24160 1726853543.76516: variable 'ansible_shell_executable' from source: unknown 24160 1726853543.76522: variable 'ansible_connection' from source: unknown 24160 1726853543.76528: variable 'ansible_module_compression' from source: unknown 24160 1726853543.76533: variable 'ansible_shell_type' from source: unknown 24160 
1726853543.76538: variable 'ansible_shell_executable' from source: unknown 24160 1726853543.76545: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853543.76620: variable 'ansible_pipelining' from source: unknown 24160 1726853543.76623: variable 'ansible_timeout' from source: unknown 24160 1726853543.76625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853543.76768: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853543.76787: variable 'omit' from source: magic vars 24160 1726853543.76797: starting attempt loop 24160 1726853543.76806: running the handler 24160 1726853543.76824: _low_level_execute_command(): starting 24160 1726853543.76843: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853543.77584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853543.77609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853543.77689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853543.77739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853543.77764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853543.77782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853543.77869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853543.79573: stdout chunk (state=3): >>>/root <<< 24160 1726853543.79666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853543.79728: stderr chunk (state=3): >>><<< 24160 1726853543.79761: stdout chunk (state=3): >>><<< 24160 1726853543.79794: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 24160 1726853543.79825: _low_level_execute_command(): starting 24160 1726853543.79841: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254 `" && echo ansible-tmp-1726853543.7980947-25130-164265978419254="` echo /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254 `" ) && sleep 0' 24160 1726853543.80515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853543.80531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853543.80585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853543.80632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853543.80646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853543.80665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853543.80975: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 24160 1726853543.82685: stdout chunk (state=3): >>>ansible-tmp-1726853543.7980947-25130-164265978419254=/root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254 <<< 24160 1726853543.82853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853543.82857: stdout chunk (state=3): >>><<< 24160 1726853543.82859: stderr chunk (state=3): >>><<< 24160 1726853543.82879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853543.7980947-25130-164265978419254=/root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853543.82941: variable 'ansible_module_compression' from source: unknown 24160 1726853543.82997: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24160 1726853543.83202: variable 'ansible_facts' from source: unknown 24160 1726853543.83661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py 24160 1726853543.83866: Sending initial data 24160 1726853543.83967: Sent initial data (162 bytes) 24160 1726853543.84745: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853543.84762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853543.84781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853543.84834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853543.84909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853543.84940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853543.84961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853543.85037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
24160 1726853543.86637: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853543.86676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853543.86861: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp2ngcsqho /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py <<< 24160 1726853543.86866: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py" <<< 24160 1726853543.86869: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp2ngcsqho" to remote "/root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py" <<< 24160 1726853543.89699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853543.89713: stdout chunk (state=3): >>><<< 24160 1726853543.89736: stderr chunk (state=3): >>><<< 24160 1726853543.89760: done 
transferring module to remote 24160 1726853543.89881: _low_level_execute_command(): starting 24160 1726853543.89885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/ /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py && sleep 0' 24160 1726853543.90609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853543.90618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853543.90629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853543.90645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853543.90661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853543.90669: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853543.90713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853543.90772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853543.90804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853543.90841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 
1726853543.90859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853543.92953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853543.92956: stdout chunk (state=3): >>><<< 24160 1726853543.92959: stderr chunk (state=3): >>><<< 24160 1726853543.92961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853543.92963: _low_level_execute_command(): starting 24160 1726853543.92966: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/AnsiballZ_package_facts.py && sleep 0' 24160 1726853543.93502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853543.93520: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 24160 1726853543.93535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853543.93550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853543.93566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853543.93625: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853543.93673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853543.93693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853543.93720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853543.93812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853544.38315: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", 
"version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": 
"1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": 
"3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": 
[{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": 
"2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": 
"57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": 
[{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": 
"3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": 
[{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": 
[{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", 
"version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": 
[{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24160 1726853544.39994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853544.40093: stderr chunk (state=3): >>><<< 24160 1726853544.40103: stdout chunk (state=3): >>><<< 24160 1726853544.40280: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": 
"12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": 
"1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": 
[{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": 
"4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": 
"23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853544.44822: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853544.44826: _low_level_execute_command(): starting 24160 1726853544.44829: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853543.7980947-25130-164265978419254/ > /dev/null 2>&1 && sleep 0' 24160 1726853544.46060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853544.46079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 24160 1726853544.46092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853544.46236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853544.46322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853544.46369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853544.48309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853544.48513: stderr chunk (state=3): >>><<< 24160 1726853544.48516: stdout chunk (state=3): >>><<< 24160 1726853544.48518: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853544.48521: handler run complete 24160 1726853544.49561: variable 'ansible_facts' from source: unknown 24160 1726853544.50082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.52228: variable 'ansible_facts' from source: unknown 24160 1726853544.53248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.54105: attempt loop complete, returning result 24160 1726853544.54123: _execute() done 24160 1726853544.54133: dumping result to json 24160 1726853544.54349: done dumping result, returning 24160 1726853544.54356: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5676-4eb4-00000000049b] 24160 1726853544.54367: sending task result for task 02083763-bbaf-5676-4eb4-00000000049b 24160 1726853544.57341: done sending task result for task 02083763-bbaf-5676-4eb4-00000000049b 24160 1726853544.57345: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853544.57505: no more pending results, returning what we have 24160 1726853544.57508: results queue empty 24160 1726853544.57509: checking for any_errors_fatal 24160 1726853544.57514: done checking for any_errors_fatal 24160 1726853544.57514: checking for max_fail_percentage 24160 1726853544.57516: done checking for max_fail_percentage 24160 1726853544.57516: checking to see if all hosts have failed and the running result is not ok 24160 1726853544.57517: done checking to see if all hosts have failed 24160 1726853544.57518: getting the remaining 
hosts for this loop 24160 1726853544.57519: done getting the remaining hosts for this loop 24160 1726853544.57522: getting the next task for host managed_node1 24160 1726853544.57528: done getting next task for host managed_node1 24160 1726853544.57531: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24160 1726853544.57533: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853544.57543: getting variables 24160 1726853544.57544: in VariableManager get_vars() 24160 1726853544.57777: Calling all_inventory to load vars for managed_node1 24160 1726853544.57781: Calling groups_inventory to load vars for managed_node1 24160 1726853544.57784: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853544.57793: Calling all_plugins_play to load vars for managed_node1 24160 1726853544.57795: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853544.57798: Calling groups_plugins_play to load vars for managed_node1 24160 1726853544.59244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.60902: done with get_vars() 24160 1726853544.60936: done getting variables 24160 1726853544.60999: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 
2024 13:32:24 -0400 (0:00:00.884) 0:00:21.013 ****** 24160 1726853544.61032: entering _queue_task() for managed_node1/debug 24160 1726853544.61395: worker is 1 (out of 1 available) 24160 1726853544.61409: exiting _queue_task() for managed_node1/debug 24160 1726853544.61423: done queuing things up, now waiting for results queue to drain 24160 1726853544.61425: waiting for pending results... 24160 1726853544.61720: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 24160 1726853544.61842: in run() - task 02083763-bbaf-5676-4eb4-000000000068 24160 1726853544.61898: variable 'ansible_search_path' from source: unknown 24160 1726853544.61902: variable 'ansible_search_path' from source: unknown 24160 1726853544.61921: calling self._execute() 24160 1726853544.62030: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.62042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.62130: variable 'omit' from source: magic vars 24160 1726853544.62742: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.62746: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.62808: variable 'connection_failed' from source: set_fact 24160 1726853544.62819: Evaluated conditional (not connection_failed): True 24160 1726853544.63177: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.63180: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.63229: variable 'connection_failed' from source: set_fact 24160 1726853544.63292: Evaluated conditional (not connection_failed): True 24160 1726853544.63303: variable 'omit' from source: magic vars 24160 1726853544.63342: variable 'omit' from source: magic vars 24160 1726853544.63584: variable 'network_provider' from source: set_fact 24160 1726853544.63613: variable 'omit' from source: magic vars 24160 
1726853544.63656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853544.63756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853544.63785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853544.63822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853544.63855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853544.63896: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853544.63906: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.63915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.64046: Set connection var ansible_shell_executable to /bin/sh 24160 1726853544.64050: Set connection var ansible_pipelining to False 24160 1726853544.64052: Set connection var ansible_connection to ssh 24160 1726853544.64155: Set connection var ansible_shell_type to sh 24160 1726853544.64158: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853544.64161: Set connection var ansible_timeout to 10 24160 1726853544.64163: variable 'ansible_shell_executable' from source: unknown 24160 1726853544.64166: variable 'ansible_connection' from source: unknown 24160 1726853544.64168: variable 'ansible_module_compression' from source: unknown 24160 1726853544.64173: variable 'ansible_shell_type' from source: unknown 24160 1726853544.64176: variable 'ansible_shell_executable' from source: unknown 24160 1726853544.64178: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.64180: variable 'ansible_pipelining' from source: unknown 24160 1726853544.64182: variable 
'ansible_timeout' from source: unknown 24160 1726853544.64184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.64301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853544.64317: variable 'omit' from source: magic vars 24160 1726853544.64327: starting attempt loop 24160 1726853544.64334: running the handler 24160 1726853544.64388: handler run complete 24160 1726853544.64406: attempt loop complete, returning result 24160 1726853544.64414: _execute() done 24160 1726853544.64421: dumping result to json 24160 1726853544.64429: done dumping result, returning 24160 1726853544.64440: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5676-4eb4-000000000068] 24160 1726853544.64450: sending task result for task 02083763-bbaf-5676-4eb4-000000000068 ok: [managed_node1] => {} MSG: Using network provider: nm 24160 1726853544.64641: no more pending results, returning what we have 24160 1726853544.64644: results queue empty 24160 1726853544.64645: checking for any_errors_fatal 24160 1726853544.64657: done checking for any_errors_fatal 24160 1726853544.64658: checking for max_fail_percentage 24160 1726853544.64660: done checking for max_fail_percentage 24160 1726853544.64660: checking to see if all hosts have failed and the running result is not ok 24160 1726853544.64661: done checking to see if all hosts have failed 24160 1726853544.64662: getting the remaining hosts for this loop 24160 1726853544.64663: done getting the remaining hosts for this loop 24160 1726853544.64667: getting the next task for host managed_node1 24160 1726853544.64676: done getting next task for host managed_node1 24160 
1726853544.64681: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24160 1726853544.64683: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853544.64693: getting variables 24160 1726853544.64695: in VariableManager get_vars() 24160 1726853544.64734: Calling all_inventory to load vars for managed_node1 24160 1726853544.64738: Calling groups_inventory to load vars for managed_node1 24160 1726853544.64740: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853544.64751: Calling all_plugins_play to load vars for managed_node1 24160 1726853544.64754: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853544.64758: Calling groups_plugins_play to load vars for managed_node1 24160 1726853544.65485: done sending task result for task 02083763-bbaf-5676-4eb4-000000000068 24160 1726853544.65488: WORKER PROCESS EXITING 24160 1726853544.66835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.68722: done with get_vars() 24160 1726853544.68740: done getting variables 24160 1726853544.68792: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:32:24 -0400 (0:00:00.077) 0:00:21.090 ****** 24160 1726853544.68823: entering _queue_task() for managed_node1/fail 24160 1726853544.69073: worker is 1 (out of 1 available) 24160 1726853544.69086: exiting _queue_task() for managed_node1/fail 24160 1726853544.69102: done queuing things up, now waiting for results queue to drain 24160 1726853544.69104: waiting for pending results... 24160 1726853544.69303: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24160 1726853544.69377: in run() - task 02083763-bbaf-5676-4eb4-000000000069 24160 1726853544.69388: variable 'ansible_search_path' from source: unknown 24160 1726853544.69391: variable 'ansible_search_path' from source: unknown 24160 1726853544.69422: calling self._execute() 24160 1726853544.69489: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.69494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.69502: variable 'omit' from source: magic vars 24160 1726853544.69778: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.69788: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.69867: variable 'connection_failed' from source: set_fact 24160 1726853544.69872: Evaluated conditional (not connection_failed): True 24160 1726853544.69946: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.69949: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.70021: variable 'connection_failed' from source: set_fact 24160 1726853544.70024: Evaluated conditional (not connection_failed): True 24160 1726853544.70124: variable 'network_state' from source: 
role '' defaults 24160 1726853544.70133: Evaluated conditional (network_state != {}): False 24160 1726853544.70137: when evaluation is False, skipping this task 24160 1726853544.70140: _execute() done 24160 1726853544.70143: dumping result to json 24160 1726853544.70145: done dumping result, returning 24160 1726853544.70152: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-5676-4eb4-000000000069] 24160 1726853544.70159: sending task result for task 02083763-bbaf-5676-4eb4-000000000069 24160 1726853544.70249: done sending task result for task 02083763-bbaf-5676-4eb4-000000000069 24160 1726853544.70252: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853544.70300: no more pending results, returning what we have 24160 1726853544.70304: results queue empty 24160 1726853544.70305: checking for any_errors_fatal 24160 1726853544.70311: done checking for any_errors_fatal 24160 1726853544.70312: checking for max_fail_percentage 24160 1726853544.70314: done checking for max_fail_percentage 24160 1726853544.70314: checking to see if all hosts have failed and the running result is not ok 24160 1726853544.70315: done checking to see if all hosts have failed 24160 1726853544.70316: getting the remaining hosts for this loop 24160 1726853544.70317: done getting the remaining hosts for this loop 24160 1726853544.70320: getting the next task for host managed_node1 24160 1726853544.70326: done getting next task for host managed_node1 24160 1726853544.70329: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24160 1726853544.70332: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853544.70346: getting variables 24160 1726853544.70347: in VariableManager get_vars() 24160 1726853544.70503: Calling all_inventory to load vars for managed_node1 24160 1726853544.70506: Calling groups_inventory to load vars for managed_node1 24160 1726853544.70508: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853544.70516: Calling all_plugins_play to load vars for managed_node1 24160 1726853544.70519: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853544.70521: Calling groups_plugins_play to load vars for managed_node1 24160 1726853544.74896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.75753: done with get_vars() 24160 1726853544.75770: done getting variables 24160 1726853544.75808: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:32:24 -0400 (0:00:00.070) 0:00:21.160 ****** 24160 1726853544.75828: entering _queue_task() for managed_node1/fail 24160 1726853544.76089: worker is 1 (out of 1 available) 24160 1726853544.76102: exiting _queue_task() for managed_node1/fail 24160 1726853544.76117: done queuing things up, now waiting for results queue to drain 24160 
1726853544.76119: waiting for pending results... 24160 1726853544.76324: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24160 1726853544.76401: in run() - task 02083763-bbaf-5676-4eb4-00000000006a 24160 1726853544.76412: variable 'ansible_search_path' from source: unknown 24160 1726853544.76416: variable 'ansible_search_path' from source: unknown 24160 1726853544.76444: calling self._execute() 24160 1726853544.76535: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.76542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.76552: variable 'omit' from source: magic vars 24160 1726853544.76834: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.76844: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.76921: variable 'connection_failed' from source: set_fact 24160 1726853544.76925: Evaluated conditional (not connection_failed): True 24160 1726853544.77001: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.77005: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.77070: variable 'connection_failed' from source: set_fact 24160 1726853544.77075: Evaluated conditional (not connection_failed): True 24160 1726853544.77151: variable 'network_state' from source: role '' defaults 24160 1726853544.77162: Evaluated conditional (network_state != {}): False 24160 1726853544.77165: when evaluation is False, skipping this task 24160 1726853544.77167: _execute() done 24160 1726853544.77172: dumping result to json 24160 1726853544.77175: done dumping result, returning 24160 1726853544.77186: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system 
version of the managed host is below 8 [02083763-bbaf-5676-4eb4-00000000006a] 24160 1726853544.77190: sending task result for task 02083763-bbaf-5676-4eb4-00000000006a 24160 1726853544.77281: done sending task result for task 02083763-bbaf-5676-4eb4-00000000006a 24160 1726853544.77284: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853544.77358: no more pending results, returning what we have 24160 1726853544.77362: results queue empty 24160 1726853544.77363: checking for any_errors_fatal 24160 1726853544.77375: done checking for any_errors_fatal 24160 1726853544.77376: checking for max_fail_percentage 24160 1726853544.77377: done checking for max_fail_percentage 24160 1726853544.77378: checking to see if all hosts have failed and the running result is not ok 24160 1726853544.77379: done checking to see if all hosts have failed 24160 1726853544.77380: getting the remaining hosts for this loop 24160 1726853544.77381: done getting the remaining hosts for this loop 24160 1726853544.77385: getting the next task for host managed_node1 24160 1726853544.77390: done getting next task for host managed_node1 24160 1726853544.77394: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24160 1726853544.77396: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853544.77408: getting variables 24160 1726853544.77410: in VariableManager get_vars() 24160 1726853544.77439: Calling all_inventory to load vars for managed_node1 24160 1726853544.77442: Calling groups_inventory to load vars for managed_node1 24160 1726853544.77443: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853544.77451: Calling all_plugins_play to load vars for managed_node1 24160 1726853544.77453: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853544.77456: Calling groups_plugins_play to load vars for managed_node1 24160 1726853544.78636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.79968: done with get_vars() 24160 1726853544.79985: done getting variables 24160 1726853544.80025: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:32:24 -0400 (0:00:00.042) 0:00:21.203 ****** 24160 1726853544.80049: entering _queue_task() for managed_node1/fail 24160 1726853544.80279: worker is 1 (out of 1 available) 24160 1726853544.80293: exiting _queue_task() for managed_node1/fail 24160 1726853544.80306: done queuing things up, now waiting for results queue to drain 24160 1726853544.80308: waiting for pending results... 
24160 1726853544.80483: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24160 1726853544.80555: in run() - task 02083763-bbaf-5676-4eb4-00000000006b 24160 1726853544.80569: variable 'ansible_search_path' from source: unknown 24160 1726853544.80575: variable 'ansible_search_path' from source: unknown 24160 1726853544.80602: calling self._execute() 24160 1726853544.80681: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.80688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.80696: variable 'omit' from source: magic vars 24160 1726853544.80975: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.80983: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.81053: variable 'connection_failed' from source: set_fact 24160 1726853544.81060: Evaluated conditional (not connection_failed): True 24160 1726853544.81137: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.81141: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.81211: variable 'connection_failed' from source: set_fact 24160 1726853544.81215: Evaluated conditional (not connection_failed): True 24160 1726853544.81331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853544.82820: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853544.82869: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853544.82896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853544.82921: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853544.82941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853544.83000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853544.83020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.83039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.83069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.83081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853544.83147: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.83160: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24160 1726853544.83235: variable 'ansible_distribution' from source: facts 24160 1726853544.83239: variable '__network_rh_distros' from source: role '' defaults 24160 1726853544.83246: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24160 1726853544.83402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
24160 1726853544.83418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.83436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.83461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.83475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853544.83508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853544.83523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.83539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.83564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.83578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24160 1726853544.83606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853544.83622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.83638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.83663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.83675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853544.83860: variable 'network_connections' from source: play vars 24160 1726853544.83867: variable 'profile' from source: play vars 24160 1726853544.83917: variable 'profile' from source: play vars 24160 1726853544.83920: variable 'interface' from source: set_fact 24160 1726853544.83962: variable 'interface' from source: set_fact 24160 1726853544.83972: variable 'network_state' from source: role '' defaults 24160 1726853544.84023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853544.84141: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853544.84168: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853544.84194: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853544.84215: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853544.84247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853544.84264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853544.84283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.84300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853544.84319: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24160 1726853544.84322: when evaluation is False, skipping this task 24160 1726853544.84324: _execute() done 24160 1726853544.84327: dumping result to json 24160 1726853544.84329: done dumping result, returning 24160 1726853544.84337: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5676-4eb4-00000000006b] 24160 1726853544.84341: sending task result for task 02083763-bbaf-5676-4eb4-00000000006b 24160 1726853544.84425: done sending task result for task 
02083763-bbaf-5676-4eb4-00000000006b 24160 1726853544.84428: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24160 1726853544.84494: no more pending results, returning what we have 24160 1726853544.84497: results queue empty 24160 1726853544.84498: checking for any_errors_fatal 24160 1726853544.84503: done checking for any_errors_fatal 24160 1726853544.84504: checking for max_fail_percentage 24160 1726853544.84505: done checking for max_fail_percentage 24160 1726853544.84506: checking to see if all hosts have failed and the running result is not ok 24160 1726853544.84507: done checking to see if all hosts have failed 24160 1726853544.84508: getting the remaining hosts for this loop 24160 1726853544.84509: done getting the remaining hosts for this loop 24160 1726853544.84512: getting the next task for host managed_node1 24160 1726853544.84518: done getting next task for host managed_node1 24160 1726853544.84521: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24160 1726853544.84523: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853544.84535: getting variables 24160 1726853544.84537: in VariableManager get_vars() 24160 1726853544.84577: Calling all_inventory to load vars for managed_node1 24160 1726853544.84579: Calling groups_inventory to load vars for managed_node1 24160 1726853544.84581: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853544.84590: Calling all_plugins_play to load vars for managed_node1 24160 1726853544.84593: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853544.84595: Calling groups_plugins_play to load vars for managed_node1 24160 1726853544.85399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.86274: done with get_vars() 24160 1726853544.86291: done getting variables 24160 1726853544.86333: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:32:24 -0400 (0:00:00.063) 0:00:21.266 ****** 24160 1726853544.86356: entering _queue_task() for managed_node1/dnf 24160 1726853544.86595: worker is 1 (out of 1 available) 24160 1726853544.86610: exiting _queue_task() for managed_node1/dnf 24160 1726853544.86621: done queuing things up, now waiting for results queue to drain 24160 1726853544.86623: waiting for pending results... 
24160 1726853544.86790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24160 1726853544.86856: in run() - task 02083763-bbaf-5676-4eb4-00000000006c 24160 1726853544.86867: variable 'ansible_search_path' from source: unknown 24160 1726853544.86870: variable 'ansible_search_path' from source: unknown 24160 1726853544.86909: calling self._execute() 24160 1726853544.86985: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.86990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.86999: variable 'omit' from source: magic vars 24160 1726853544.87267: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.87279: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.87351: variable 'connection_failed' from source: set_fact 24160 1726853544.87358: Evaluated conditional (not connection_failed): True 24160 1726853544.87432: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.87436: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.87503: variable 'connection_failed' from source: set_fact 24160 1726853544.87507: Evaluated conditional (not connection_failed): True 24160 1726853544.87633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853544.89312: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853544.89356: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853544.89392: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853544.89418: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853544.89439: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853544.89499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853544.89519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.89536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.89562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.89574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853544.89650: variable 'ansible_distribution' from source: facts 24160 1726853544.89656: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.89667: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24160 1726853544.89743: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853544.89827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 
1726853544.89844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.89861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.89891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.89900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853544.89930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853544.89945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.89962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.89988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.89999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24160 1726853544.90028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853544.90043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853544.90059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.90085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853544.90096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853544.90376: variable 'network_connections' from source: play vars 24160 1726853544.90380: variable 'profile' from source: play vars 24160 1726853544.90385: variable 'profile' from source: play vars 24160 1726853544.90388: variable 'interface' from source: set_fact 24160 1726853544.90390: variable 'interface' from source: set_fact 24160 1726853544.90497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853544.90690: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853544.90741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853544.90785: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853544.90850: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853544.90900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853544.90928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853544.90974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853544.91008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853544.91078: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853544.91339: variable 'network_connections' from source: play vars 24160 1726853544.91351: variable 'profile' from source: play vars 24160 1726853544.91429: variable 'profile' from source: play vars 24160 1726853544.91476: variable 'interface' from source: set_fact 24160 1726853544.91518: variable 'interface' from source: set_fact 24160 1726853544.91547: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853544.91559: when evaluation is False, skipping this task 24160 1726853544.91567: _execute() done 24160 1726853544.91575: dumping result to json 24160 1726853544.91582: done dumping result, returning 24160 1726853544.91603: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000006c] 24160 1726853544.91613: sending task result for task 02083763-bbaf-5676-4eb4-00000000006c skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24160 1726853544.91828: no more pending results, returning what we have 24160 1726853544.91833: results queue empty 24160 1726853544.91834: checking for any_errors_fatal 24160 1726853544.91843: done checking for any_errors_fatal 24160 1726853544.91843: checking for max_fail_percentage 24160 1726853544.91846: done checking for max_fail_percentage 24160 1726853544.91847: checking to see if all hosts have failed and the running result is not ok 24160 1726853544.91848: done checking to see if all hosts have failed 24160 1726853544.91849: getting the remaining hosts for this loop 24160 1726853544.91850: done getting the remaining hosts for this loop 24160 1726853544.91857: getting the next task for host managed_node1 24160 1726853544.91863: done getting next task for host managed_node1 24160 1726853544.91867: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24160 1726853544.91873: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853544.92084: getting variables 24160 1726853544.92086: in VariableManager get_vars() 24160 1726853544.92127: Calling all_inventory to load vars for managed_node1 24160 1726853544.92131: Calling groups_inventory to load vars for managed_node1 24160 1726853544.92134: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853544.92145: Calling all_plugins_play to load vars for managed_node1 24160 1726853544.92148: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853544.92151: Calling groups_plugins_play to load vars for managed_node1 24160 1726853544.92699: done sending task result for task 02083763-bbaf-5676-4eb4-00000000006c 24160 1726853544.92703: WORKER PROCESS EXITING 24160 1726853544.93946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853544.95790: done with get_vars() 24160 1726853544.95813: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24160 1726853544.95920: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:32:24 -0400 (0:00:00.095) 0:00:21.362 ****** 24160 1726853544.95952: entering _queue_task() for managed_node1/yum 24160 1726853544.96289: worker is 1 (out of 1 available) 24160 1726853544.96301: exiting _queue_task() for managed_node1/yum 24160 1726853544.96315: done queuing things up, now 
waiting for results queue to drain 24160 1726853544.96317: waiting for pending results... 24160 1726853544.96609: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24160 1726853544.96715: in run() - task 02083763-bbaf-5676-4eb4-00000000006d 24160 1726853544.96736: variable 'ansible_search_path' from source: unknown 24160 1726853544.96745: variable 'ansible_search_path' from source: unknown 24160 1726853544.96792: calling self._execute() 24160 1726853544.96897: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853544.96911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853544.96923: variable 'omit' from source: magic vars 24160 1726853544.97317: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.97339: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.97564: variable 'connection_failed' from source: set_fact 24160 1726853544.97568: Evaluated conditional (not connection_failed): True 24160 1726853544.97590: variable 'ansible_distribution_major_version' from source: facts 24160 1726853544.97602: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853544.97704: variable 'connection_failed' from source: set_fact 24160 1726853544.97714: Evaluated conditional (not connection_failed): True 24160 1726853544.98160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853545.01205: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853545.01289: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853545.01329: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853545.01378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853545.01410: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853545.01500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.01535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.01978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.01982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.01984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.02087: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.02090: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24160 1726853545.02093: when evaluation is False, skipping this task 24160 1726853545.02097: _execute() done 24160 1726853545.02099: dumping result to json 24160 1726853545.02102: done dumping result, returning 24160 1726853545.02104: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000006d] 24160 1726853545.02107: sending task result for task 02083763-bbaf-5676-4eb4-00000000006d 24160 1726853545.02378: done sending task result for task 02083763-bbaf-5676-4eb4-00000000006d 24160 1726853545.02382: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24160 1726853545.02437: no more pending results, returning what we have 24160 1726853545.02441: results queue empty 24160 1726853545.02442: checking for any_errors_fatal 24160 1726853545.02448: done checking for any_errors_fatal 24160 1726853545.02448: checking for max_fail_percentage 24160 1726853545.02451: done checking for max_fail_percentage 24160 1726853545.02452: checking to see if all hosts have failed and the running result is not ok 24160 1726853545.02452: done checking to see if all hosts have failed 24160 1726853545.02453: getting the remaining hosts for this loop 24160 1726853545.02455: done getting the remaining hosts for this loop 24160 1726853545.02459: getting the next task for host managed_node1 24160 1726853545.02466: done getting next task for host managed_node1 24160 1726853545.02469: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24160 1726853545.02475: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853545.02489: getting variables 24160 1726853545.02490: in VariableManager get_vars() 24160 1726853545.02529: Calling all_inventory to load vars for managed_node1 24160 1726853545.02532: Calling groups_inventory to load vars for managed_node1 24160 1726853545.02535: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853545.02545: Calling all_plugins_play to load vars for managed_node1 24160 1726853545.02548: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853545.02551: Calling groups_plugins_play to load vars for managed_node1 24160 1726853545.04396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853545.05959: done with get_vars() 24160 1726853545.05988: done getting variables 24160 1726853545.06040: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:32:25 -0400 (0:00:00.101) 0:00:21.463 ****** 24160 1726853545.06072: entering _queue_task() for managed_node1/fail 24160 1726853545.06404: worker is 1 (out of 1 available) 24160 1726853545.06416: exiting _queue_task() for managed_node1/fail 24160 1726853545.06429: done queuing things up, now waiting for results queue to drain 24160 1726853545.06431: waiting for pending results... 
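The `skipping: [managed_node1]` result above is the standard shape of a conditional task: the `when:` expression evaluates to False, the executor skips the task, and the failing expression is echoed back as `false_condition`. A simplified, hypothetical task of that shape (not the actual `fedora.linux_system_roles.network` source, which lives at the `tasks/main.yml` path shown in the log) would be:

```yaml
# Hypothetical sketch only: a task guarded by the same distribution-version
# conditional that produced the "Conditional result was False" skip above.
- name: Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
  ansible.builtin.command: yum check-update NetworkManager  # illustrative command, not the role's real action
  register: __yum_check
  changed_when: false
  failed_when: false
  when: ansible_distribution_major_version | int < 8
```

On the EL9-era managed node in this run, `ansible_distribution_major_version | int < 8` is False, so the task is skipped exactly as logged.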
24160 1726853545.06704: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24160 1726853545.06812: in run() - task 02083763-bbaf-5676-4eb4-00000000006e 24160 1726853545.06830: variable 'ansible_search_path' from source: unknown 24160 1726853545.06837: variable 'ansible_search_path' from source: unknown 24160 1726853545.06877: calling self._execute() 24160 1726853545.06981: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853545.07077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853545.07080: variable 'omit' from source: magic vars 24160 1726853545.07385: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.07402: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.07514: variable 'connection_failed' from source: set_fact 24160 1726853545.07525: Evaluated conditional (not connection_failed): True 24160 1726853545.07635: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.07646: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.07745: variable 'connection_failed' from source: set_fact 24160 1726853545.07758: Evaluated conditional (not connection_failed): True 24160 1726853545.07864: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853545.08046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853545.10446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853545.10491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853545.10532: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853545.10577: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853545.10610: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853545.10774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.10778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.10781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.10802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.10821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.10869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.10901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.10930: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.10987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.10995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.11039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.11076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.11105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.11203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.11206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.11308: variable 'network_connections' from source: play vars 24160 1726853545.11325: variable 'profile' from source: play vars 24160 1726853545.11398: variable 'profile' from source: play vars 24160 1726853545.11423: variable 
'interface' from source: set_fact 24160 1726853545.11477: variable 'interface' from source: set_fact 24160 1726853545.11642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853545.11976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853545.11980: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853545.12021: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853545.12124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853545.12170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853545.12300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853545.12357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.12391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853545.12530: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853545.13095: variable 'network_connections' from source: play vars 24160 1726853545.13098: variable 'profile' from source: play vars 24160 1726853545.13159: variable 'profile' from source: play vars 24160 1726853545.13211: variable 'interface' from source: set_fact 24160 1726853545.13422: 
variable 'interface' from source: set_fact 24160 1726853545.13425: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853545.13429: when evaluation is False, skipping this task 24160 1726853545.13439: _execute() done 24160 1726853545.13446: dumping result to json 24160 1726853545.13453: done dumping result, returning 24160 1726853545.13465: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000006e] 24160 1726853545.13477: sending task result for task 02083763-bbaf-5676-4eb4-00000000006e skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24160 1726853545.13726: no more pending results, returning what we have 24160 1726853545.13730: results queue empty 24160 1726853545.13731: checking for any_errors_fatal 24160 1726853545.13737: done checking for any_errors_fatal 24160 1726853545.13738: checking for max_fail_percentage 24160 1726853545.13740: done checking for max_fail_percentage 24160 1726853545.13741: checking to see if all hosts have failed and the running result is not ok 24160 1726853545.13741: done checking to see if all hosts have failed 24160 1726853545.13742: getting the remaining hosts for this loop 24160 1726853545.13743: done getting the remaining hosts for this loop 24160 1726853545.13747: getting the next task for host managed_node1 24160 1726853545.13754: done getting next task for host managed_node1 24160 1726853545.13758: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24160 1726853545.13760: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853545.13776: getting variables 24160 1726853545.13778: in VariableManager get_vars() 24160 1726853545.13817: Calling all_inventory to load vars for managed_node1 24160 1726853545.13820: Calling groups_inventory to load vars for managed_node1 24160 1726853545.13823: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853545.13835: Calling all_plugins_play to load vars for managed_node1 24160 1726853545.13838: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853545.13841: Calling groups_plugins_play to load vars for managed_node1 24160 1726853545.14545: done sending task result for task 02083763-bbaf-5676-4eb4-00000000006e 24160 1726853545.14548: WORKER PROCESS EXITING 24160 1726853545.16359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853545.17943: done with get_vars() 24160 1726853545.17968: done getting variables 24160 1726853545.18031: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:32:25 -0400 (0:00:00.119) 0:00:21.583 ****** 24160 1726853545.18063: entering _queue_task() for managed_node1/package 24160 1726853545.18503: worker is 1 (out of 1 available) 24160 1726853545.18513: exiting _queue_task() for managed_node1/package 24160 1726853545.18524: done queuing things up, now waiting for results queue to drain 24160 1726853545.18526: waiting for pending 
results... 24160 1726853545.18709: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 24160 1726853545.18824: in run() - task 02083763-bbaf-5676-4eb4-00000000006f 24160 1726853545.18843: variable 'ansible_search_path' from source: unknown 24160 1726853545.18853: variable 'ansible_search_path' from source: unknown 24160 1726853545.18896: calling self._execute() 24160 1726853545.18997: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853545.19007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853545.19019: variable 'omit' from source: magic vars 24160 1726853545.19389: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.19410: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.19520: variable 'connection_failed' from source: set_fact 24160 1726853545.19530: Evaluated conditional (not connection_failed): True 24160 1726853545.19641: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.19652: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.19750: variable 'connection_failed' from source: set_fact 24160 1726853545.19760: Evaluated conditional (not connection_failed): True 24160 1726853545.19945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853545.20376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853545.20379: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853545.20382: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853545.20385: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853545.20463: 
variable 'network_packages' from source: role '' defaults 24160 1726853545.20573: variable '__network_provider_setup' from source: role '' defaults 24160 1726853545.20589: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853545.20663: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853545.20680: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853545.20747: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853545.20939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853545.23779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853545.23783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853545.23903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853545.23943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853545.24005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853545.24154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.24250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.24347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 24160 1726853545.24413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.24493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.24592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.24864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.24867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.24870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.24874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.25394: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24160 1726853545.25701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.25730: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.25830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.25923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.26010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.26190: variable 'ansible_python' from source: facts 24160 1726853545.26351: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24160 1726853545.26579: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853545.26992: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853545.27169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.27488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.27492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.27502: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.27505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.27553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.27587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.27630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.27688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.27716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.27889: variable 'network_connections' from source: play vars 24160 1726853545.27901: variable 'profile' from source: play vars 24160 1726853545.28018: variable 'profile' from source: play vars 24160 1726853545.28046: variable 'interface' from source: set_fact 24160 1726853545.28125: variable 'interface' from source: set_fact 24160 1726853545.28224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853545.28266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853545.28302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.28334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853545.28396: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853545.28789: variable 'network_connections' from source: play vars 24160 1726853545.28795: variable 'profile' from source: play vars 24160 1726853545.28902: variable 'profile' from source: play vars 24160 1726853545.28976: variable 'interface' from source: set_fact 24160 1726853545.29004: variable 'interface' from source: set_fact 24160 1726853545.29044: variable '__network_packages_default_wireless' from source: role '' defaults 24160 1726853545.29150: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853545.29878: variable 'network_connections' from source: play vars 24160 1726853545.29881: variable 'profile' from source: play vars 24160 1726853545.29884: variable 'profile' from source: play vars 24160 1726853545.29886: variable 'interface' from source: set_fact 24160 1726853545.30072: variable 'interface' from source: set_fact 24160 1726853545.30119: variable '__network_packages_default_team' from source: role '' defaults 24160 1726853545.30257: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853545.31024: variable 
'network_connections' from source: play vars 24160 1726853545.31036: variable 'profile' from source: play vars 24160 1726853545.31102: variable 'profile' from source: play vars 24160 1726853545.31378: variable 'interface' from source: set_fact 24160 1726853545.31386: variable 'interface' from source: set_fact 24160 1726853545.31443: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853545.31778: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853545.31782: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853545.31805: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853545.32218: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24160 1726853545.33700: variable 'network_connections' from source: play vars 24160 1726853545.33711: variable 'profile' from source: play vars 24160 1726853545.33783: variable 'profile' from source: play vars 24160 1726853545.33905: variable 'interface' from source: set_fact 24160 1726853545.33975: variable 'interface' from source: set_fact 24160 1726853545.33990: variable 'ansible_distribution' from source: facts 24160 1726853545.34013: variable '__network_rh_distros' from source: role '' defaults 24160 1726853545.34085: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.34104: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24160 1726853545.34460: variable 'ansible_distribution' from source: facts 24160 1726853545.34470: variable '__network_rh_distros' from source: role '' defaults 24160 1726853545.34484: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.34502: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24160 1726853545.34977: variable 'ansible_distribution' from source: 
facts 24160 1726853545.34981: variable '__network_rh_distros' from source: role '' defaults 24160 1726853545.34983: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.34985: variable 'network_provider' from source: set_fact 24160 1726853545.34987: variable 'ansible_facts' from source: unknown 24160 1726853545.36675: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24160 1726853545.36685: when evaluation is False, skipping this task 24160 1726853545.36764: _execute() done 24160 1726853545.36767: dumping result to json 24160 1726853545.36770: done dumping result, returning 24160 1726853545.36775: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5676-4eb4-00000000006f] 24160 1726853545.36778: sending task result for task 02083763-bbaf-5676-4eb4-00000000006f 24160 1726853545.37050: done sending task result for task 02083763-bbaf-5676-4eb4-00000000006f 24160 1726853545.37056: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24160 1726853545.37131: no more pending results, returning what we have 24160 1726853545.37135: results queue empty 24160 1726853545.37137: checking for any_errors_fatal 24160 1726853545.37143: done checking for any_errors_fatal 24160 1726853545.37143: checking for max_fail_percentage 24160 1726853545.37145: done checking for max_fail_percentage 24160 1726853545.37146: checking to see if all hosts have failed and the running result is not ok 24160 1726853545.37147: done checking to see if all hosts have failed 24160 1726853545.37148: getting the remaining hosts for this loop 24160 1726853545.37149: done getting the remaining hosts for this loop 24160 1726853545.37156: getting the next task for host managed_node1 24160 1726853545.37162: done getting next 
task for host managed_node1 24160 1726853545.37166: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24160 1726853545.37176: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853545.37192: getting variables 24160 1726853545.37193: in VariableManager get_vars() 24160 1726853545.37232: Calling all_inventory to load vars for managed_node1 24160 1726853545.37236: Calling groups_inventory to load vars for managed_node1 24160 1726853545.37239: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853545.37249: Calling all_plugins_play to load vars for managed_node1 24160 1726853545.37253: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853545.37259: Calling groups_plugins_play to load vars for managed_node1 24160 1726853545.41105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853545.46681: done with get_vars() 24160 1726853545.46714: done getting variables 24160 1726853545.46987: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:32:25 -0400 (0:00:00.289) 0:00:21.872 ****** 24160 1726853545.47021: entering _queue_task() for 
managed_node1/package 24160 1726853545.48003: worker is 1 (out of 1 available) 24160 1726853545.48017: exiting _queue_task() for managed_node1/package 24160 1726853545.48030: done queuing things up, now waiting for results queue to drain 24160 1726853545.48032: waiting for pending results... 24160 1726853545.48792: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24160 1726853545.48798: in run() - task 02083763-bbaf-5676-4eb4-000000000070 24160 1726853545.48802: variable 'ansible_search_path' from source: unknown 24160 1726853545.48805: variable 'ansible_search_path' from source: unknown 24160 1726853545.49177: calling self._execute() 24160 1726853545.49181: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853545.49184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853545.49187: variable 'omit' from source: magic vars 24160 1726853545.49898: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.50177: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.50203: variable 'connection_failed' from source: set_fact 24160 1726853545.50213: Evaluated conditional (not connection_failed): True 24160 1726853545.50310: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.50576: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.50581: variable 'connection_failed' from source: set_fact 24160 1726853545.50591: Evaluated conditional (not connection_failed): True 24160 1726853545.50977: variable 'network_state' from source: role '' defaults 24160 1726853545.50981: Evaluated conditional (network_state != {}): False 24160 1726853545.50983: when evaluation is False, skipping this task 24160 1726853545.50986: _execute() done 24160 1726853545.50988: dumping result to json 24160 
1726853545.50990: done dumping result, returning 24160 1726853545.50993: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5676-4eb4-000000000070] 24160 1726853545.50996: sending task result for task 02083763-bbaf-5676-4eb4-000000000070 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853545.51135: no more pending results, returning what we have 24160 1726853545.51139: results queue empty 24160 1726853545.51140: checking for any_errors_fatal 24160 1726853545.51146: done checking for any_errors_fatal 24160 1726853545.51147: checking for max_fail_percentage 24160 1726853545.51149: done checking for max_fail_percentage 24160 1726853545.51150: checking to see if all hosts have failed and the running result is not ok 24160 1726853545.51150: done checking to see if all hosts have failed 24160 1726853545.51151: getting the remaining hosts for this loop 24160 1726853545.51152: done getting the remaining hosts for this loop 24160 1726853545.51158: getting the next task for host managed_node1 24160 1726853545.51163: done getting next task for host managed_node1 24160 1726853545.51167: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24160 1726853545.51169: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853545.51186: getting variables 24160 1726853545.51188: in VariableManager get_vars() 24160 1726853545.51228: Calling all_inventory to load vars for managed_node1 24160 1726853545.51232: Calling groups_inventory to load vars for managed_node1 24160 1726853545.51234: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853545.51246: Calling all_plugins_play to load vars for managed_node1 24160 1726853545.51250: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853545.51253: Calling groups_plugins_play to load vars for managed_node1 24160 1726853545.52090: done sending task result for task 02083763-bbaf-5676-4eb4-000000000070 24160 1726853545.52094: WORKER PROCESS EXITING 24160 1726853545.54034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853545.57233: done with get_vars() 24160 1726853545.57265: done getting variables 24160 1726853545.57530: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:32:25 -0400 (0:00:00.105) 0:00:21.978 ****** 24160 1726853545.57565: entering _queue_task() for managed_node1/package 24160 1726853545.58529: worker is 1 (out of 1 available) 24160 1726853545.58542: exiting _queue_task() for managed_node1/package 24160 1726853545.58554: done queuing things up, now waiting for results queue to drain 24160 1726853545.58555: waiting for pending results... 
24160 1726853545.59363: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24160 1726853545.59468: in run() - task 02083763-bbaf-5676-4eb4-000000000071 24160 1726853545.60078: variable 'ansible_search_path' from source: unknown 24160 1726853545.60083: variable 'ansible_search_path' from source: unknown 24160 1726853545.60086: calling self._execute() 24160 1726853545.60089: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853545.60094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853545.60097: variable 'omit' from source: magic vars 24160 1726853545.61240: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.61492: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.61608: variable 'connection_failed' from source: set_fact 24160 1726853545.62077: Evaluated conditional (not connection_failed): True 24160 1726853545.62477: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.62480: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.62484: variable 'connection_failed' from source: set_fact 24160 1726853545.62487: Evaluated conditional (not connection_failed): True 24160 1726853545.62910: variable 'network_state' from source: role '' defaults 24160 1726853545.62925: Evaluated conditional (network_state != {}): False 24160 1726853545.62934: when evaluation is False, skipping this task 24160 1726853545.62942: _execute() done 24160 1726853545.62949: dumping result to json 24160 1726853545.62960: done dumping result, returning 24160 1726853545.62975: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5676-4eb4-000000000071] 24160 1726853545.62986: sending task result for 
task 02083763-bbaf-5676-4eb4-000000000071 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853545.63151: no more pending results, returning what we have 24160 1726853545.63155: results queue empty 24160 1726853545.63157: checking for any_errors_fatal 24160 1726853545.63165: done checking for any_errors_fatal 24160 1726853545.63166: checking for max_fail_percentage 24160 1726853545.63168: done checking for max_fail_percentage 24160 1726853545.63169: checking to see if all hosts have failed and the running result is not ok 24160 1726853545.63169: done checking to see if all hosts have failed 24160 1726853545.63170: getting the remaining hosts for this loop 24160 1726853545.63173: done getting the remaining hosts for this loop 24160 1726853545.63177: getting the next task for host managed_node1 24160 1726853545.63184: done getting next task for host managed_node1 24160 1726853545.63188: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24160 1726853545.63191: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853545.63206: getting variables 24160 1726853545.63208: in VariableManager get_vars() 24160 1726853545.63249: Calling all_inventory to load vars for managed_node1 24160 1726853545.63252: Calling groups_inventory to load vars for managed_node1 24160 1726853545.63255: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853545.63267: Calling all_plugins_play to load vars for managed_node1 24160 1726853545.63474: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853545.63481: done sending task result for task 02083763-bbaf-5676-4eb4-000000000071 24160 1726853545.63484: WORKER PROCESS EXITING 24160 1726853545.63489: Calling groups_plugins_play to load vars for managed_node1 24160 1726853545.67941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853545.71860: done with get_vars() 24160 1726853545.71894: done getting variables 24160 1726853545.71954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:32:25 -0400 (0:00:00.146) 0:00:22.124 ****** 24160 1726853545.72192: entering _queue_task() for managed_node1/service 24160 1726853545.72757: worker is 1 (out of 1 available) 24160 1726853545.72769: exiting _queue_task() for managed_node1/service 24160 1726853545.73082: done queuing things up, now waiting for results queue to drain 24160 1726853545.73084: waiting for pending results... 
24160 1726853545.73280: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24160 1726853545.73583: in run() - task 02083763-bbaf-5676-4eb4-000000000072 24160 1726853545.73602: variable 'ansible_search_path' from source: unknown 24160 1726853545.73610: variable 'ansible_search_path' from source: unknown 24160 1726853545.73675: calling self._execute() 24160 1726853545.73962: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853545.73978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853545.73992: variable 'omit' from source: magic vars 24160 1726853545.74870: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.74957: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.75182: variable 'connection_failed' from source: set_fact 24160 1726853545.75192: Evaluated conditional (not connection_failed): True 24160 1726853545.75412: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.75423: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.75624: variable 'connection_failed' from source: set_fact 24160 1726853545.75634: Evaluated conditional (not connection_failed): True 24160 1726853545.75935: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853545.76366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853545.81038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853545.81237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853545.81587: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853545.81591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853545.81594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853545.81679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.81715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.81742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.81886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.81910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.82023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.82140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.82166: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.82216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.82293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.82360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853545.82578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853545.82581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.82584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853545.82776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853545.83037: variable 'network_connections' from source: play vars 24160 1726853545.83137: variable 'profile' from source: play vars 24160 1726853545.83220: variable 'profile' from source: play vars 24160 1726853545.83283: variable 
'interface' from source: set_fact 24160 1726853545.83464: variable 'interface' from source: set_fact 24160 1726853545.83543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853545.83961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853545.84039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853545.84108: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853545.84232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853545.84376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853545.84380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853545.84404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853545.84778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853545.84782: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853545.85099: variable 'network_connections' from source: play vars 24160 1726853545.85276: variable 'profile' from source: play vars 24160 1726853545.85337: variable 'profile' from source: play vars 24160 1726853545.85346: variable 'interface' from source: set_fact 24160 1726853545.85483: 
variable 'interface' from source: set_fact 24160 1726853545.85514: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853545.85582: when evaluation is False, skipping this task 24160 1726853545.85591: _execute() done 24160 1726853545.85600: dumping result to json 24160 1726853545.85606: done dumping result, returning 24160 1726853545.85616: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-000000000072] 24160 1726853545.85624: sending task result for task 02083763-bbaf-5676-4eb4-000000000072 24160 1726853545.85939: done sending task result for task 02083763-bbaf-5676-4eb4-000000000072 24160 1726853545.85943: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24160 1726853545.85993: no more pending results, returning what we have 24160 1726853545.85997: results queue empty 24160 1726853545.85998: checking for any_errors_fatal 24160 1726853545.86004: done checking for any_errors_fatal 24160 1726853545.86004: checking for max_fail_percentage 24160 1726853545.86006: done checking for max_fail_percentage 24160 1726853545.86007: checking to see if all hosts have failed and the running result is not ok 24160 1726853545.86008: done checking to see if all hosts have failed 24160 1726853545.86009: getting the remaining hosts for this loop 24160 1726853545.86010: done getting the remaining hosts for this loop 24160 1726853545.86013: getting the next task for host managed_node1 24160 1726853545.86019: done getting next task for host managed_node1 24160 1726853545.86022: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24160 1726853545.86024: ^ state is: HOST STATE: block=2, task=16, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853545.86039: getting variables 24160 1726853545.86040: in VariableManager get_vars() 24160 1726853545.86082: Calling all_inventory to load vars for managed_node1 24160 1726853545.86085: Calling groups_inventory to load vars for managed_node1 24160 1726853545.86087: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853545.86099: Calling all_plugins_play to load vars for managed_node1 24160 1726853545.86102: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853545.86104: Calling groups_plugins_play to load vars for managed_node1 24160 1726853545.89136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853545.92343: done with get_vars() 24160 1726853545.92377: done getting variables 24160 1726853545.92439: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:32:25 -0400 (0:00:00.202) 0:00:22.327 ****** 24160 1726853545.92673: entering _queue_task() for managed_node1/service 24160 1726853545.93235: worker is 1 (out of 1 available) 24160 1726853545.93247: exiting _queue_task() for managed_node1/service 24160 1726853545.93261: done queuing things up, now waiting for results queue to drain 24160 1726853545.93263: waiting for pending results... 
24160 1726853545.93951: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24160 1726853545.94047: in run() - task 02083763-bbaf-5676-4eb4-000000000073 24160 1726853545.94094: variable 'ansible_search_path' from source: unknown 24160 1726853545.94164: variable 'ansible_search_path' from source: unknown 24160 1726853545.94207: calling self._execute() 24160 1726853545.94479: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853545.94494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853545.94509: variable 'omit' from source: magic vars 24160 1726853545.95289: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.95312: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.95535: variable 'connection_failed' from source: set_fact 24160 1726853545.95546: Evaluated conditional (not connection_failed): True 24160 1726853545.95732: variable 'ansible_distribution_major_version' from source: facts 24160 1726853545.95965: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853545.95969: variable 'connection_failed' from source: set_fact 24160 1726853545.96082: Evaluated conditional (not connection_failed): True 24160 1726853545.96351: variable 'network_provider' from source: set_fact 24160 1726853545.96365: variable 'network_state' from source: role '' defaults 24160 1726853545.96408: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24160 1726853545.96420: variable 'omit' from source: magic vars 24160 1726853545.96516: variable 'omit' from source: magic vars 24160 1726853545.96550: variable 'network_service_name' from source: role '' defaults 24160 1726853545.96732: variable 'network_service_name' from source: role '' defaults 24160 1726853545.96939: variable '__network_provider_setup' from source: role '' defaults 24160 
1726853545.97159: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853545.97162: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853545.97165: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853545.97296: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853545.97738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853546.02791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853546.02882: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853546.02924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853546.02973: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853546.03007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853546.03096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.03129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.03167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.03214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.03232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.03290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.03317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.03348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.03400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.03419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.03669: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24160 1726853546.03798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.03920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 24160 1726853546.03927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.03929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.03932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.04018: variable 'ansible_python' from source: facts 24160 1726853546.04053: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24160 1726853546.04148: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853546.04233: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853546.04388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.04418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.04446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.04501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 
1726853546.04527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.04588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.04615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.04681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.04698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.04717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.04875: variable 'network_connections' from source: play vars 24160 1726853546.04890: variable 'profile' from source: play vars 24160 1726853546.04977: variable 'profile' from source: play vars 24160 1726853546.04987: variable 'interface' from source: set_fact 24160 1726853546.05059: variable 'interface' from source: set_fact 24160 1726853546.05180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853546.05398: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853546.05461: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853546.05510: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853546.05564: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853546.05629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853546.05676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853546.05715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.05753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853546.05968: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853546.06685: variable 'network_connections' from source: play vars 24160 1726853546.06688: variable 'profile' from source: play vars 24160 1726853546.06801: variable 'profile' from source: play vars 24160 1726853546.06812: variable 'interface' from source: set_fact 24160 1726853546.06879: variable 'interface' from source: set_fact 24160 1726853546.07014: variable '__network_packages_default_wireless' from source: role '' defaults 24160 1726853546.07196: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853546.07818: variable 'network_connections' from source: play vars 24160 1726853546.07828: variable 'profile' from source: play vars 24160 
1726853546.07916: variable 'profile' from source: play vars 24160 1726853546.07927: variable 'interface' from source: set_fact 24160 1726853546.08011: variable 'interface' from source: set_fact 24160 1726853546.08042: variable '__network_packages_default_team' from source: role '' defaults 24160 1726853546.08133: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853546.08451: variable 'network_connections' from source: play vars 24160 1726853546.08465: variable 'profile' from source: play vars 24160 1726853546.08547: variable 'profile' from source: play vars 24160 1726853546.08560: variable 'interface' from source: set_fact 24160 1726853546.08645: variable 'interface' from source: set_fact 24160 1726853546.08711: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853546.08786: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853546.08798: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853546.08867: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853546.09102: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24160 1726853546.09977: variable 'network_connections' from source: play vars 24160 1726853546.09980: variable 'profile' from source: play vars 24160 1726853546.09982: variable 'profile' from source: play vars 24160 1726853546.09984: variable 'interface' from source: set_fact 24160 1726853546.10487: variable 'interface' from source: set_fact 24160 1726853546.10490: variable 'ansible_distribution' from source: facts 24160 1726853546.10492: variable '__network_rh_distros' from source: role '' defaults 24160 1726853546.10495: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.10497: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24160 
1726853546.11177: variable 'ansible_distribution' from source: facts 24160 1726853546.11180: variable '__network_rh_distros' from source: role '' defaults 24160 1726853546.11182: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.11184: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24160 1726853546.11460: variable 'ansible_distribution' from source: facts 24160 1726853546.11527: variable '__network_rh_distros' from source: role '' defaults 24160 1726853546.11538: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.11613: variable 'network_provider' from source: set_fact 24160 1726853546.11655: variable 'omit' from source: magic vars 24160 1726853546.11768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853546.11803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853546.11865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853546.11890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853546.12063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853546.12067: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853546.12069: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853546.12073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853546.12214: Set connection var ansible_shell_executable to /bin/sh 24160 1726853546.12287: Set connection var ansible_pipelining to False 24160 1726853546.12294: Set connection var ansible_connection to ssh 24160 1726853546.12302: Set connection var 
ansible_shell_type to sh 24160 1726853546.12314: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853546.12327: Set connection var ansible_timeout to 10 24160 1726853546.12360: variable 'ansible_shell_executable' from source: unknown 24160 1726853546.12391: variable 'ansible_connection' from source: unknown 24160 1726853546.12576: variable 'ansible_module_compression' from source: unknown 24160 1726853546.12579: variable 'ansible_shell_type' from source: unknown 24160 1726853546.12581: variable 'ansible_shell_executable' from source: unknown 24160 1726853546.12584: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853546.12586: variable 'ansible_pipelining' from source: unknown 24160 1726853546.12588: variable 'ansible_timeout' from source: unknown 24160 1726853546.12590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853546.12777: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853546.12780: variable 'omit' from source: magic vars 24160 1726853546.12783: starting attempt loop 24160 1726853546.12785: running the handler 24160 1726853546.12970: variable 'ansible_facts' from source: unknown 24160 1726853546.14401: _low_level_execute_command(): starting 24160 1726853546.14416: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853546.15920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853546.15974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853546.16144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853546.16486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853546.16563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853546.18477: stdout chunk (state=3): >>>/root <<< 24160 1726853546.18520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853546.18526: stdout chunk (state=3): >>><<< 24160 1726853546.18534: stderr chunk (state=3): >>><<< 24160 1726853546.18556: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853546.18569: _low_level_execute_command(): starting 24160 1726853546.18576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229 `" && echo ansible-tmp-1726853546.1855426-25252-261907244521229="` echo /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229 `" ) && sleep 0' 24160 1726853546.19876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853546.19879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853546.19882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853546.19884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853546.19889: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853546.19965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853546.20185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853546.22257: stdout chunk (state=3): >>>ansible-tmp-1726853546.1855426-25252-261907244521229=/root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229 <<< 24160 1726853546.22261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853546.22263: stderr chunk (state=3): >>><<< 24160 1726853546.22266: stdout chunk (state=3): >>><<< 24160 1726853546.22413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853546.1855426-25252-261907244521229=/root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853546.22422: variable 'ansible_module_compression' from source: unknown 24160 1726853546.22439: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24160 1726853546.22689: variable 'ansible_facts' from source: unknown 24160 1726853546.23080: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py 24160 1726853546.23410: Sending initial data 24160 1726853546.23413: Sent initial data (156 bytes) 24160 1726853546.24432: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853546.24540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853546.24543: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853546.24567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853546.24700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853546.26242: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853546.26354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853546.26422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfpcajdjd /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py <<< 24160 1726853546.26425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py" <<< 24160 1726853546.26485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfpcajdjd" to remote "/root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py" <<< 24160 1726853546.29548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853546.29552: stderr chunk (state=3): >>><<< 24160 1726853546.29554: stdout chunk (state=3): >>><<< 24160 1726853546.29588: done transferring module to remote 24160 1726853546.29601: _low_level_execute_command(): starting 24160 1726853546.29604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/ /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py && sleep 0' 24160 1726853546.30839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853546.30847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853546.30963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 
1726853546.30966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853546.30973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853546.31086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853546.31322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853546.32942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853546.32945: stderr chunk (state=3): >>><<< 24160 1726853546.32948: stdout chunk (state=3): >>><<< 24160 1726853546.32966: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853546.32969: _low_level_execute_command(): starting 24160 1726853546.32988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/AnsiballZ_systemd.py && sleep 0' 24160 1726853546.34067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853546.34150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853546.34167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853546.34190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853546.34255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853546.34403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK <<< 24160 1726853546.34424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853546.34501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853546.63730: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10682368", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323711488", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1095745000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override 
cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": 
"9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": 
"13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24160 1726853546.65738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853546.65775: stdout chunk (state=3): >>><<< 24160 1726853546.65902: stderr chunk (state=3): >>><<< 24160 1726853546.66128: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 
13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10682368", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323711488", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1095745000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not 
set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853546.66758: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853546.66762: _low_level_execute_command(): starting 24160 1726853546.66765: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853546.1855426-25252-261907244521229/ > /dev/null 2>&1 && sleep 0' 24160 1726853546.68396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853546.68410: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853546.68467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853546.68514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853546.68860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853546.70794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853546.70798: stdout chunk (state=3): >>><<< 24160 1726853546.70800: stderr chunk (state=3): >>><<< 24160 1726853546.70810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853546.70817: handler run complete 24160 1726853546.70901: attempt loop complete, returning result 24160 1726853546.70904: _execute() done 24160 1726853546.70907: dumping result to json 24160 1726853546.70909: done dumping result, returning 24160 1726853546.70917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5676-4eb4-000000000073] 24160 1726853546.70919: sending task result for task 02083763-bbaf-5676-4eb4-000000000073 24160 1726853546.71701: done sending task result for task 02083763-bbaf-5676-4eb4-000000000073 24160 1726853546.71704: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853546.71758: no more pending results, returning what we have 24160 1726853546.71761: results queue empty 24160 1726853546.71763: checking for any_errors_fatal 24160 1726853546.71773: done checking for any_errors_fatal 24160 1726853546.71774: checking for max_fail_percentage 24160 1726853546.71775: done checking for max_fail_percentage 24160 1726853546.71776: checking to see if all hosts have failed and the running result is not ok 24160 1726853546.71777: done checking to see if all hosts have failed 24160 1726853546.71784: getting the remaining hosts for this 
loop 24160 1726853546.71786: done getting the remaining hosts for this loop 24160 1726853546.71790: getting the next task for host managed_node1 24160 1726853546.71795: done getting next task for host managed_node1 24160 1726853546.71798: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24160 1726853546.71801: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853546.71811: getting variables 24160 1726853546.71813: in VariableManager get_vars() 24160 1726853546.71848: Calling all_inventory to load vars for managed_node1 24160 1726853546.71851: Calling groups_inventory to load vars for managed_node1 24160 1726853546.71856: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853546.71866: Calling all_plugins_play to load vars for managed_node1 24160 1726853546.71869: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853546.71877: Calling groups_plugins_play to load vars for managed_node1 24160 1726853546.74585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853546.76986: done with get_vars() 24160 1726853546.77013: done getting variables 24160 1726853546.77256: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 
2024 13:32:26 -0400 (0:00:00.848) 0:00:23.175 ****** 24160 1726853546.77291: entering _queue_task() for managed_node1/service 24160 1726853546.77696: worker is 1 (out of 1 available) 24160 1726853546.77709: exiting _queue_task() for managed_node1/service 24160 1726853546.77722: done queuing things up, now waiting for results queue to drain 24160 1726853546.77724: waiting for pending results... 24160 1726853546.78088: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24160 1726853546.78377: in run() - task 02083763-bbaf-5676-4eb4-000000000074 24160 1726853546.78381: variable 'ansible_search_path' from source: unknown 24160 1726853546.78384: variable 'ansible_search_path' from source: unknown 24160 1726853546.78386: calling self._execute() 24160 1726853546.78389: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853546.78391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853546.78394: variable 'omit' from source: magic vars 24160 1726853546.78779: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.78798: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853546.78911: variable 'connection_failed' from source: set_fact 24160 1726853546.78924: Evaluated conditional (not connection_failed): True 24160 1726853546.79037: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.79054: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853546.79157: variable 'connection_failed' from source: set_fact 24160 1726853546.79168: Evaluated conditional (not connection_failed): True 24160 1726853546.79265: variable 'network_provider' from source: set_fact 24160 1726853546.79279: Evaluated conditional (network_provider == "nm"): True 24160 1726853546.79386: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 
1726853546.79461: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853546.79632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853546.83333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853546.83416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853546.83564: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853546.83567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853546.83570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853546.83632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.83664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.83705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.83753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.83779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24160 1726853546.83829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.83857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.83891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.83934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.83954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.84006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853546.84033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853546.84059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.84109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853546.84212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853546.84286: variable 'network_connections' from source: play vars 24160 1726853546.84304: variable 'profile' from source: play vars 24160 1726853546.84391: variable 'profile' from source: play vars 24160 1726853546.84400: variable 'interface' from source: set_fact 24160 1726853546.84470: variable 'interface' from source: set_fact 24160 1726853546.84558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853546.84734: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853546.84974: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853546.84978: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853546.84980: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853546.84982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853546.85096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853546.85126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853546.85155: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853546.85233: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853546.86166: variable 'network_connections' from source: play vars 24160 1726853546.86188: variable 'profile' from source: play vars 24160 1726853546.86334: variable 'profile' from source: play vars 24160 1726853546.86344: variable 'interface' from source: set_fact 24160 1726853546.86575: variable 'interface' from source: set_fact 24160 1726853546.86579: Evaluated conditional (__network_wpa_supplicant_required): False 24160 1726853546.86581: when evaluation is False, skipping this task 24160 1726853546.86583: _execute() done 24160 1726853546.86584: dumping result to json 24160 1726853546.86586: done dumping result, returning 24160 1726853546.86590: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5676-4eb4-000000000074] 24160 1726853546.86592: sending task result for task 02083763-bbaf-5676-4eb4-000000000074 24160 1726853546.86884: done sending task result for task 02083763-bbaf-5676-4eb4-000000000074 24160 1726853546.86888: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24160 1726853546.86965: no more pending results, returning what we have 24160 1726853546.86969: results queue empty 24160 1726853546.86970: checking for any_errors_fatal 24160 1726853546.86995: done checking for any_errors_fatal 24160 1726853546.86996: checking for max_fail_percentage 24160 1726853546.86998: done checking for max_fail_percentage 24160 1726853546.87000: checking to see if all hosts have failed and the running result is not ok 24160 1726853546.87000: done checking to see if all hosts 
have failed 24160 1726853546.87001: getting the remaining hosts for this loop 24160 1726853546.87002: done getting the remaining hosts for this loop 24160 1726853546.87006: getting the next task for host managed_node1 24160 1726853546.87012: done getting next task for host managed_node1 24160 1726853546.87016: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24160 1726853546.87018: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853546.87033: getting variables 24160 1726853546.87035: in VariableManager get_vars() 24160 1726853546.87279: Calling all_inventory to load vars for managed_node1 24160 1726853546.87282: Calling groups_inventory to load vars for managed_node1 24160 1726853546.87285: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853546.87295: Calling all_plugins_play to load vars for managed_node1 24160 1726853546.87297: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853546.87300: Calling groups_plugins_play to load vars for managed_node1 24160 1726853546.90385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853546.93614: done with get_vars() 24160 1726853546.93647: done getting variables 24160 1726853546.93916: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:32:26 -0400 (0:00:00.166) 0:00:23.342 ****** 24160 1726853546.93949: entering _queue_task() for managed_node1/service 24160 1726853546.94983: worker is 1 (out of 1 available) 24160 1726853546.94996: exiting _queue_task() for managed_node1/service 24160 1726853546.95013: done queuing things up, now waiting for results queue to drain 24160 1726853546.95015: waiting for pending results... 24160 1726853546.95793: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 24160 1726853546.95888: in run() - task 02083763-bbaf-5676-4eb4-000000000075 24160 1726853546.95892: variable 'ansible_search_path' from source: unknown 24160 1726853546.95894: variable 'ansible_search_path' from source: unknown 24160 1726853546.95897: calling self._execute() 24160 1726853546.96091: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853546.96211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853546.96216: variable 'omit' from source: magic vars 24160 1726853546.97000: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.97021: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853546.97249: variable 'connection_failed' from source: set_fact 24160 1726853546.97476: Evaluated conditional (not connection_failed): True 24160 1726853546.97597: variable 'ansible_distribution_major_version' from source: facts 24160 1726853546.97634: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853546.97823: variable 'connection_failed' from source: set_fact 24160 1726853546.97836: Evaluated conditional (not connection_failed): True 24160 1726853546.98083: variable 'network_provider' from source: set_fact 24160 1726853546.98277: Evaluated conditional (network_provider == 
"initscripts"): False 24160 1726853546.98280: when evaluation is False, skipping this task 24160 1726853546.98285: _execute() done 24160 1726853546.98287: dumping result to json 24160 1726853546.98289: done dumping result, returning 24160 1726853546.98292: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5676-4eb4-000000000075] 24160 1726853546.98297: sending task result for task 02083763-bbaf-5676-4eb4-000000000075 24160 1726853546.98574: done sending task result for task 02083763-bbaf-5676-4eb4-000000000075 24160 1726853546.98578: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853546.98632: no more pending results, returning what we have 24160 1726853546.98637: results queue empty 24160 1726853546.98638: checking for any_errors_fatal 24160 1726853546.98650: done checking for any_errors_fatal 24160 1726853546.98651: checking for max_fail_percentage 24160 1726853546.98653: done checking for max_fail_percentage 24160 1726853546.98654: checking to see if all hosts have failed and the running result is not ok 24160 1726853546.98655: done checking to see if all hosts have failed 24160 1726853546.98656: getting the remaining hosts for this loop 24160 1726853546.98657: done getting the remaining hosts for this loop 24160 1726853546.98661: getting the next task for host managed_node1 24160 1726853546.98668: done getting next task for host managed_node1 24160 1726853546.98676: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24160 1726853546.98681: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24160 1726853546.98698: getting variables 24160 1726853546.98700: in VariableManager get_vars() 24160 1726853546.98745: Calling all_inventory to load vars for managed_node1 24160 1726853546.98749: Calling groups_inventory to load vars for managed_node1 24160 1726853546.98752: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853546.98767: Calling all_plugins_play to load vars for managed_node1 24160 1726853546.98976: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853546.98982: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.02229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853547.05514: done with get_vars() 24160 1726853547.05660: done getting variables 24160 1726853547.05731: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:32:27 -0400 (0:00:00.119) 0:00:23.461 ****** 24160 1726853547.05885: entering _queue_task() for managed_node1/copy 24160 1726853547.06640: worker is 1 (out of 1 available) 24160 1726853547.06654: exiting _queue_task() for managed_node1/copy 24160 1726853547.06669: done queuing things up, now waiting for results queue to drain 24160 1726853547.06673: waiting for pending results... 
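The "Enable network service" task above is skipped because its `when:` clauses did not all evaluate True: the trace shows `ansible_distribution_major_version != '6'` and `not connection_failed` both True, but `network_provider == "initscripts"` False. A minimal illustrative stand-in for that evaluation (this is not Ansible's actual TaskExecutor code, and the fact values here are placeholders; the log only records the comparison results):

```python
# Simplified sketch of Ansible's `when:` handling: a task runs only if
# every conditional is truthy. The provider here is NetworkManager ("nm"),
# so the initscripts-only task is skipped.
facts = {
    "ansible_distribution_major_version": "40",  # placeholder value
    "connection_failed": False,
    "network_provider": "nm",
}

when_clauses = [
    facts["ansible_distribution_major_version"] != "6",  # True in the log
    not facts["connection_failed"],                      # True in the log
    facts["network_provider"] == "initscripts",          # False -> skip
]

should_run = all(when_clauses)
print(should_run)  # False -> "when evaluation is False, skipping this task"
```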
24160 1726853547.07185: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24160 1726853547.07832: in run() - task 02083763-bbaf-5676-4eb4-000000000076 24160 1726853547.07856: variable 'ansible_search_path' from source: unknown 24160 1726853547.07860: variable 'ansible_search_path' from source: unknown 24160 1726853547.08236: calling self._execute() 24160 1726853547.08650: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.08654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.08799: variable 'omit' from source: magic vars 24160 1726853547.10057: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.10061: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.10287: variable 'connection_failed' from source: set_fact 24160 1726853547.10292: Evaluated conditional (not connection_failed): True 24160 1726853547.10867: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.10878: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.11265: variable 'connection_failed' from source: set_fact 24160 1726853547.11272: Evaluated conditional (not connection_failed): True 24160 1726853547.11687: variable 'network_provider' from source: set_fact 24160 1726853547.11691: Evaluated conditional (network_provider == "initscripts"): False 24160 1726853547.11693: when evaluation is False, skipping this task 24160 1726853547.11698: _execute() done 24160 1726853547.11701: dumping result to json 24160 1726853547.11709: done dumping result, returning 24160 1726853547.11719: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5676-4eb4-000000000076] 24160 1726853547.11724: sending task result for task 
02083763-bbaf-5676-4eb4-000000000076 24160 1726853547.11958: done sending task result for task 02083763-bbaf-5676-4eb4-000000000076 24160 1726853547.11961: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24160 1726853547.12046: no more pending results, returning what we have 24160 1726853547.12051: results queue empty 24160 1726853547.12052: checking for any_errors_fatal 24160 1726853547.12062: done checking for any_errors_fatal 24160 1726853547.12063: checking for max_fail_percentage 24160 1726853547.12065: done checking for max_fail_percentage 24160 1726853547.12066: checking to see if all hosts have failed and the running result is not ok 24160 1726853547.12067: done checking to see if all hosts have failed 24160 1726853547.12067: getting the remaining hosts for this loop 24160 1726853547.12069: done getting the remaining hosts for this loop 24160 1726853547.12074: getting the next task for host managed_node1 24160 1726853547.12081: done getting next task for host managed_node1 24160 1726853547.12085: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24160 1726853547.12088: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853547.12104: getting variables 24160 1726853547.12106: in VariableManager get_vars() 24160 1726853547.12147: Calling all_inventory to load vars for managed_node1 24160 1726853547.12150: Calling groups_inventory to load vars for managed_node1 24160 1726853547.12152: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853547.12163: Calling all_plugins_play to load vars for managed_node1 24160 1726853547.12166: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853547.12168: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.14613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853547.16431: done with get_vars() 24160 1726853547.16467: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:32:27 -0400 (0:00:00.106) 0:00:23.568 ****** 24160 1726853547.16561: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24160 1726853547.17391: worker is 1 (out of 1 available) 24160 1726853547.17402: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24160 1726853547.17413: done queuing things up, now waiting for results queue to drain 24160 1726853547.17414: waiting for pending results... 
24160 1726853547.17798: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24160 1726853547.17806: in run() - task 02083763-bbaf-5676-4eb4-000000000077 24160 1726853547.17810: variable 'ansible_search_path' from source: unknown 24160 1726853547.17812: variable 'ansible_search_path' from source: unknown 24160 1726853547.17914: calling self._execute() 24160 1726853547.18085: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.18089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.18144: variable 'omit' from source: magic vars 24160 1726853547.18829: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.18833: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.18837: variable 'connection_failed' from source: set_fact 24160 1726853547.18840: Evaluated conditional (not connection_failed): True 24160 1726853547.19063: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.19067: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.19227: variable 'connection_failed' from source: set_fact 24160 1726853547.19238: Evaluated conditional (not connection_failed): True 24160 1726853547.19245: variable 'omit' from source: magic vars 24160 1726853547.19315: variable 'omit' from source: magic vars 24160 1726853547.19532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853547.22097: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853547.22200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853547.22246: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853547.22285: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853547.22313: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853547.22426: variable 'network_provider' from source: set_fact 24160 1726853547.22624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853547.22651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853547.22741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853547.22744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853547.22747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853547.22832: variable 'omit' from source: magic vars 24160 1726853547.22972: variable 'omit' from source: magic vars 24160 1726853547.23082: variable 'network_connections' from source: play vars 24160 1726853547.23094: variable 'profile' from source: play vars 24160 1726853547.23160: variable 'profile' from source: play vars 24160 1726853547.23163: variable 'interface' from source: set_fact 24160 1726853547.23276: variable 'interface' from 
source: set_fact 24160 1726853547.23439: variable 'omit' from source: magic vars 24160 1726853547.23462: variable '__lsr_ansible_managed' from source: task vars 24160 1726853547.23600: variable '__lsr_ansible_managed' from source: task vars 24160 1726853547.24002: Loaded config def from plugin (lookup/template) 24160 1726853547.24006: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24160 1726853547.24030: File lookup term: get_ansible_managed.j2 24160 1726853547.24033: variable 'ansible_search_path' from source: unknown 24160 1726853547.24078: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24160 1726853547.24277: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24160 1726853547.24280: variable 'ansible_search_path' from source: unknown 24160 1726853547.32063: variable 'ansible_managed' from source: unknown 
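The template lookup above shows the candidate ordering Ansible uses: for each directory on the role's evaluation_path, it tries a `templates/` subdirectory first, then the directory itself. A hedged reconstruction of that ordering (paths mirror the log; note the actual trace additionally lists the playbooks pair twice, which this sketch does not reproduce):

```python
# Build the candidate list for a template lookup term, in the order the
# log's search_path shows: <dir>/templates/<term> before <dir>/<term>.
base = "/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles"
evaluation_path = [
    f"{base}/roles/network",
    f"{base}/roles/network/tasks",
    f"{base}/tests/network/playbooks",
]
term = "get_ansible_managed.j2"

search_path = [
    candidate
    for d in evaluation_path
    for candidate in (f"{d}/templates/{term}", f"{d}/{term}")
]
print(search_path[0])  # the role's templates/ directory is tried first
```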
24160 1726853547.32196: variable 'omit' from source: magic vars 24160 1726853547.32236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853547.32266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853547.32291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853547.32315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853547.32333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853547.32365: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853547.32377: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.32386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.32488: Set connection var ansible_shell_executable to /bin/sh 24160 1726853547.32539: Set connection var ansible_pipelining to False 24160 1726853547.32543: Set connection var ansible_connection to ssh 24160 1726853547.32545: Set connection var ansible_shell_type to sh 24160 1726853547.32548: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853547.32550: Set connection var ansible_timeout to 10 24160 1726853547.32565: variable 'ansible_shell_executable' from source: unknown 24160 1726853547.32568: variable 'ansible_connection' from source: unknown 24160 1726853547.32572: variable 'ansible_module_compression' from source: unknown 24160 1726853547.32575: variable 'ansible_shell_type' from source: unknown 24160 1726853547.32578: variable 'ansible_shell_executable' from source: unknown 24160 1726853547.32587: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.32589: variable 
'ansible_pipelining' from source: unknown 24160 1726853547.32591: variable 'ansible_timeout' from source: unknown 24160 1726853547.32594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.32723: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853547.32738: variable 'omit' from source: magic vars 24160 1726853547.32757: starting attempt loop 24160 1726853547.32760: running the handler 24160 1726853547.32762: _low_level_execute_command(): starting 24160 1726853547.32765: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853547.33549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.33632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
24160 1726853547.33697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853547.35398: stdout chunk (state=3): >>>/root <<< 24160 1726853547.35544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853547.35547: stdout chunk (state=3): >>><<< 24160 1726853547.35550: stderr chunk (state=3): >>><<< 24160 1726853547.35573: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853547.35592: _low_level_execute_command(): starting 24160 1726853547.35669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956 `" && echo ansible-tmp-1726853547.3558092-25300-50680159654956="` echo 
/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956 `" ) && sleep 0' 24160 1726853547.36394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853547.36397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853547.36407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853547.36481: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.36516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853547.36554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853547.36607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853547.38478: stdout chunk (state=3): >>>ansible-tmp-1726853547.3558092-25300-50680159654956=/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956 <<< 24160 1726853547.38619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853547.38624: stdout chunk (state=3): >>><<< 24160 1726853547.38627: stderr chunk (state=3): >>><<< 24160 
1726853547.38646: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853547.3558092-25300-50680159654956=/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853547.38695: variable 'ansible_module_compression' from source: unknown 24160 1726853547.38751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24160 1726853547.38774: variable 'ansible_facts' from source: unknown 24160 1726853547.38888: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py 24160 1726853547.39179: Sending initial data 24160 1726853547.39183: Sent initial data (167 bytes) 24160 
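The shell command above implements Ansible's per-task remote workspace: `umask 77 && mkdir -p` creates a private parent directory, then a plain `mkdir` creates a uniquely named task directory (failing if it already exists, by design); the transferred AnsiballZ wrapper is then marked executable with `chmod u+x` a few chunks further on. A local Python sketch of the same pattern, using illustrative local paths as stand-ins for the remote `/root/.ansible/tmp` ones:

```python
import os
import tempfile
import time

# Private parent directory: equivalent of `umask 77 && mkdir -p ~/.ansible/tmp`
parent = os.path.join(tempfile.gettempdir(), "ansible-demo-tmp")
os.makedirs(parent, mode=0o700, exist_ok=True)

# Unique per-task directory, like ansible-tmp-<epoch>-<pid>-<rand> in the log
task_dir = os.path.join(parent, f"ansible-tmp-{time.time()}-{os.getpid()}")
os.mkdir(task_dir, 0o700)

# Equivalent of `chmod u+x .../AnsiballZ_network_connections.py`
wrapper = os.path.join(task_dir, "AnsiballZ_demo.py")  # hypothetical file name
open(wrapper, "w").close()
os.chmod(wrapper, 0o700)
print(task_dir)
```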
1726853547.39572: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.39585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.39641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853547.39658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853547.39713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853547.41214: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853547.41259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853547.41299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp1va0h4e9 /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py <<< 24160 1726853547.41302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py" <<< 24160 1726853547.41336: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp1va0h4e9" to remote "/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py" <<< 24160 1726853547.42068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853547.42103: stderr chunk (state=3): >>><<< 24160 1726853547.42108: stdout chunk (state=3): >>><<< 24160 1726853547.42130: done transferring module to remote 24160 1726853547.42137: _low_level_execute_command(): starting 24160 1726853547.42142: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/ /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py && sleep 0' 24160 1726853547.42778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853547.42781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
24160 1726853547.42787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.42838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853547.42849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853547.42859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853547.42984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853547.44700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853547.44719: stderr chunk (state=3): >>><<< 24160 1726853547.44722: stdout chunk (state=3): >>><<< 24160 1726853547.44734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853547.44739: _low_level_execute_command(): starting 24160 1726853547.44748: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/AnsiballZ_network_connections.py && sleep 0' 24160 1726853547.45148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853547.45152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.45156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853547.45159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853547.45161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853547.45208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853547.45212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853547.45261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853547.71514: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24160 1726853547.73191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853547.73195: stdout chunk (state=3): >>><<< 24160 1726853547.73200: stderr chunk (state=3): >>><<< 24160 1726853547.73295: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853547.73331: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853547.73341: _low_level_execute_command(): starting 24160 1726853547.73346: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853547.3558092-25300-50680159654956/ > /dev/null 2>&1 && sleep 0' 24160 1726853547.74707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853547.74721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853547.74731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853547.74746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853547.74761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853547.74768: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853547.74780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 
1726853547.74794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853547.74801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853547.74814: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24160 1726853547.74820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853547.75077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853547.75393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853547.75456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853547.77480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853547.77484: stdout chunk (state=3): >>><<< 24160 1726853547.77486: stderr chunk (state=3): >>><<< 24160 1726853547.77488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853547.77490: handler run complete 24160 1726853547.77491: attempt loop complete, returning result 24160 1726853547.77493: _execute() done 24160 1726853547.77494: dumping result to json 24160 1726853547.77496: done dumping result, returning 24160 1726853547.77498: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5676-4eb4-000000000077] 24160 1726853547.77499: sending task result for task 02083763-bbaf-5676-4eb4-000000000077 24160 1726853547.77566: done sending task result for task 02083763-bbaf-5676-4eb4-000000000077 24160 1726853547.77569: WORKER PROCESS EXITING ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 24160 1726853547.77668: no more pending results, returning what we have 24160 1726853547.77674: results queue empty 24160 1726853547.77675: checking for any_errors_fatal 24160 1726853547.77682: done checking for any_errors_fatal 24160 1726853547.77683: checking for max_fail_percentage 24160 1726853547.77684: done checking for max_fail_percentage 24160 1726853547.77685: checking to see if all hosts have failed and the running result is not ok 
24160 1726853547.77686: done checking to see if all hosts have failed 24160 1726853547.77687: getting the remaining hosts for this loop 24160 1726853547.77879: done getting the remaining hosts for this loop 24160 1726853547.77884: getting the next task for host managed_node1 24160 1726853547.77888: done getting next task for host managed_node1 24160 1726853547.77892: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24160 1726853547.77894: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853547.77903: getting variables 24160 1726853547.77904: in VariableManager get_vars() 24160 1726853547.77937: Calling all_inventory to load vars for managed_node1 24160 1726853547.77940: Calling groups_inventory to load vars for managed_node1 24160 1726853547.77942: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853547.77951: Calling all_plugins_play to load vars for managed_node1 24160 1726853547.77954: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853547.77957: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.80188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853547.81901: done with get_vars() 24160 1726853547.81924: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:32:27 -0400 (0:00:00.654) 0:00:24.222 ****** 24160 1726853547.82014: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24160 1726853547.82402: worker is 1 (out of 1 
available) 24160 1726853547.82414: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24160 1726853547.82426: done queuing things up, now waiting for results queue to drain 24160 1726853547.82427: waiting for pending results... 24160 1726853547.82674: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 24160 1726853547.82787: in run() - task 02083763-bbaf-5676-4eb4-000000000078 24160 1726853547.82809: variable 'ansible_search_path' from source: unknown 24160 1726853547.82817: variable 'ansible_search_path' from source: unknown 24160 1726853547.82858: calling self._execute() 24160 1726853547.82968: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.82988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.83089: variable 'omit' from source: magic vars 24160 1726853547.83389: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.83408: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.83531: variable 'connection_failed' from source: set_fact 24160 1726853547.83542: Evaluated conditional (not connection_failed): True 24160 1726853547.83853: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.83856: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.83903: variable 'connection_failed' from source: set_fact 24160 1726853547.83914: Evaluated conditional (not connection_failed): True 24160 1726853547.84031: variable 'network_state' from source: role '' defaults 24160 1726853547.84046: Evaluated conditional (network_state != {}): False 24160 1726853547.84054: when evaluation is False, skipping this task 24160 1726853547.84066: _execute() done 24160 1726853547.84079: dumping result to json 24160 1726853547.84086: done dumping result, returning 24160 1726853547.84098: done running 
TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5676-4eb4-000000000078] 24160 1726853547.84109: sending task result for task 02083763-bbaf-5676-4eb4-000000000078 24160 1726853547.84316: done sending task result for task 02083763-bbaf-5676-4eb4-000000000078 24160 1726853547.84319: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853547.84389: no more pending results, returning what we have 24160 1726853547.84394: results queue empty 24160 1726853547.84395: checking for any_errors_fatal 24160 1726853547.84413: done checking for any_errors_fatal 24160 1726853547.84414: checking for max_fail_percentage 24160 1726853547.84416: done checking for max_fail_percentage 24160 1726853547.84417: checking to see if all hosts have failed and the running result is not ok 24160 1726853547.84420: done checking to see if all hosts have failed 24160 1726853547.84421: getting the remaining hosts for this loop 24160 1726853547.84423: done getting the remaining hosts for this loop 24160 1726853547.84426: getting the next task for host managed_node1 24160 1726853547.84432: done getting next task for host managed_node1 24160 1726853547.84435: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24160 1726853547.84438: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853547.84452: getting variables 24160 1726853547.84453: in VariableManager get_vars() 24160 1726853547.84490: Calling all_inventory to load vars for managed_node1 24160 1726853547.84501: Calling groups_inventory to load vars for managed_node1 24160 1726853547.84504: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853547.84516: Calling all_plugins_play to load vars for managed_node1 24160 1726853547.84518: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853547.84521: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.85423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853547.86357: done with get_vars() 24160 1726853547.86378: done getting variables 24160 1726853547.86434: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:32:27 -0400 (0:00:00.044) 0:00:24.267 ****** 24160 1726853547.86464: entering _queue_task() for managed_node1/debug 24160 1726853547.86760: worker is 1 (out of 1 available) 24160 1726853547.86775: exiting _queue_task() for managed_node1/debug 24160 1726853547.86786: done queuing things up, now waiting for results queue to drain 24160 1726853547.86788: waiting for pending results... 
24160 1726853547.87188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24160 1726853547.87193: in run() - task 02083763-bbaf-5676-4eb4-000000000079 24160 1726853547.87196: variable 'ansible_search_path' from source: unknown 24160 1726853547.87199: variable 'ansible_search_path' from source: unknown 24160 1726853547.87218: calling self._execute() 24160 1726853547.87315: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.87319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.87331: variable 'omit' from source: magic vars 24160 1726853547.87614: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.87624: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.87705: variable 'connection_failed' from source: set_fact 24160 1726853547.87708: Evaluated conditional (not connection_failed): True 24160 1726853547.87786: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.87790: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.87852: variable 'connection_failed' from source: set_fact 24160 1726853547.87858: Evaluated conditional (not connection_failed): True 24160 1726853547.87862: variable 'omit' from source: magic vars 24160 1726853547.87894: variable 'omit' from source: magic vars 24160 1726853547.87919: variable 'omit' from source: magic vars 24160 1726853547.87949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853547.87977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853547.87996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853547.88009: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853547.88018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853547.88043: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853547.88046: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.88048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.88119: Set connection var ansible_shell_executable to /bin/sh 24160 1726853547.88124: Set connection var ansible_pipelining to False 24160 1726853547.88127: Set connection var ansible_connection to ssh 24160 1726853547.88129: Set connection var ansible_shell_type to sh 24160 1726853547.88136: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853547.88144: Set connection var ansible_timeout to 10 24160 1726853547.88162: variable 'ansible_shell_executable' from source: unknown 24160 1726853547.88165: variable 'ansible_connection' from source: unknown 24160 1726853547.88167: variable 'ansible_module_compression' from source: unknown 24160 1726853547.88169: variable 'ansible_shell_type' from source: unknown 24160 1726853547.88174: variable 'ansible_shell_executable' from source: unknown 24160 1726853547.88176: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.88178: variable 'ansible_pipelining' from source: unknown 24160 1726853547.88180: variable 'ansible_timeout' from source: unknown 24160 1726853547.88182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.88285: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853547.88294: variable 'omit' from source: magic vars 24160 1726853547.88299: starting attempt loop 24160 1726853547.88302: running the handler 24160 1726853547.88392: variable '__network_connections_result' from source: set_fact 24160 1726853547.88433: handler run complete 24160 1726853547.88448: attempt loop complete, returning result 24160 1726853547.88451: _execute() done 24160 1726853547.88457: dumping result to json 24160 1726853547.88459: done dumping result, returning 24160 1726853547.88464: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5676-4eb4-000000000079] 24160 1726853547.88469: sending task result for task 02083763-bbaf-5676-4eb4-000000000079 24160 1726853547.88547: done sending task result for task 02083763-bbaf-5676-4eb4-000000000079 24160 1726853547.88550: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 24160 1726853547.88642: no more pending results, returning what we have 24160 1726853547.88645: results queue empty 24160 1726853547.88646: checking for any_errors_fatal 24160 1726853547.88650: done checking for any_errors_fatal 24160 1726853547.88651: checking for max_fail_percentage 24160 1726853547.88652: done checking for max_fail_percentage 24160 1726853547.88655: checking to see if all hosts have failed and the running result is not ok 24160 1726853547.88656: done checking to see if all hosts have failed 24160 1726853547.88657: getting the remaining hosts for this loop 24160 1726853547.88658: done getting the remaining hosts for this loop 24160 1726853547.88666: getting the next task for host managed_node1 24160 1726853547.88672: done getting next task for host managed_node1 24160 1726853547.88675: ^ 
task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24160 1726853547.88677: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853547.88685: getting variables 24160 1726853547.88687: in VariableManager get_vars() 24160 1726853547.88716: Calling all_inventory to load vars for managed_node1 24160 1726853547.88718: Calling groups_inventory to load vars for managed_node1 24160 1726853547.88720: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853547.88728: Calling all_plugins_play to load vars for managed_node1 24160 1726853547.88730: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853547.88732: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.89496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853547.90585: done with get_vars() 24160 1726853547.90607: done getting variables 24160 1726853547.90666: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:32:27 -0400 (0:00:00.042) 0:00:24.309 ****** 24160 1726853547.90695: entering _queue_task() for managed_node1/debug 24160 1726853547.91133: worker is 1 (out of 1 available) 24160 1726853547.91143: exiting 
_queue_task() for managed_node1/debug 24160 1726853547.91158: done queuing things up, now waiting for results queue to drain 24160 1726853547.91160: waiting for pending results... 24160 1726853547.91289: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24160 1726853547.91336: in run() - task 02083763-bbaf-5676-4eb4-00000000007a 24160 1726853547.91378: variable 'ansible_search_path' from source: unknown 24160 1726853547.91383: variable 'ansible_search_path' from source: unknown 24160 1726853547.91386: calling self._execute() 24160 1726853547.91502: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.91506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.91508: variable 'omit' from source: magic vars 24160 1726853547.91920: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.91923: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.91982: variable 'connection_failed' from source: set_fact 24160 1726853547.91985: Evaluated conditional (not connection_failed): True 24160 1726853547.92086: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.92091: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.92162: variable 'connection_failed' from source: set_fact 24160 1726853547.92165: Evaluated conditional (not connection_failed): True 24160 1726853547.92174: variable 'omit' from source: magic vars 24160 1726853547.92201: variable 'omit' from source: magic vars 24160 1726853547.92226: variable 'omit' from source: magic vars 24160 1726853547.92261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853547.92287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
24160 1726853547.92302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853547.92315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853547.92324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853547.92366: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853547.92369: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.92373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.92435: Set connection var ansible_shell_executable to /bin/sh 24160 1726853547.92440: Set connection var ansible_pipelining to False 24160 1726853547.92443: Set connection var ansible_connection to ssh 24160 1726853547.92448: Set connection var ansible_shell_type to sh 24160 1726853547.92460: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853547.92463: Set connection var ansible_timeout to 10 24160 1726853547.92482: variable 'ansible_shell_executable' from source: unknown 24160 1726853547.92484: variable 'ansible_connection' from source: unknown 24160 1726853547.92487: variable 'ansible_module_compression' from source: unknown 24160 1726853547.92490: variable 'ansible_shell_type' from source: unknown 24160 1726853547.92493: variable 'ansible_shell_executable' from source: unknown 24160 1726853547.92495: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.92497: variable 'ansible_pipelining' from source: unknown 24160 1726853547.92500: variable 'ansible_timeout' from source: unknown 24160 1726853547.92504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.92605: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853547.92614: variable 'omit' from source: magic vars 24160 1726853547.92619: starting attempt loop 24160 1726853547.92622: running the handler 24160 1726853547.92662: variable '__network_connections_result' from source: set_fact 24160 1726853547.92715: variable '__network_connections_result' from source: set_fact 24160 1726853547.92789: handler run complete 24160 1726853547.92807: attempt loop complete, returning result 24160 1726853547.92809: _execute() done 24160 1726853547.92812: dumping result to json 24160 1726853547.92816: done dumping result, returning 24160 1726853547.92823: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5676-4eb4-00000000007a] 24160 1726853547.92828: sending task result for task 02083763-bbaf-5676-4eb4-00000000007a 24160 1726853547.92910: done sending task result for task 02083763-bbaf-5676-4eb4-00000000007a 24160 1726853547.92913: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24160 1726853547.92991: no more pending results, returning what we have 24160 1726853547.92994: results queue empty 24160 1726853547.92995: checking for any_errors_fatal 24160 1726853547.93001: done checking for any_errors_fatal 24160 1726853547.93002: checking for max_fail_percentage 24160 1726853547.93003: done checking for 
max_fail_percentage 24160 1726853547.93004: checking to see if all hosts have failed and the running result is not ok 24160 1726853547.93004: done checking to see if all hosts have failed 24160 1726853547.93005: getting the remaining hosts for this loop 24160 1726853547.93006: done getting the remaining hosts for this loop 24160 1726853547.93009: getting the next task for host managed_node1 24160 1726853547.93014: done getting next task for host managed_node1 24160 1726853547.93017: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24160 1726853547.93019: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853547.93033: getting variables 24160 1726853547.93035: in VariableManager get_vars() 24160 1726853547.93063: Calling all_inventory to load vars for managed_node1 24160 1726853547.93066: Calling groups_inventory to load vars for managed_node1 24160 1726853547.93068: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853547.93077: Calling all_plugins_play to load vars for managed_node1 24160 1726853547.93079: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853547.93082: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.93846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853547.94830: done with get_vars() 24160 1726853547.94848: done getting variables 24160 1726853547.94912: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:32:27 -0400 (0:00:00.042) 0:00:24.352 ****** 24160 1726853547.94950: entering _queue_task() for managed_node1/debug 24160 1726853547.95267: worker is 1 (out of 1 available) 24160 1726853547.95281: exiting _queue_task() for managed_node1/debug 24160 1726853547.95293: done queuing things up, now waiting for results queue to drain 24160 1726853547.95295: waiting for pending results... 24160 1726853547.95748: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24160 1726853547.95754: in run() - task 02083763-bbaf-5676-4eb4-00000000007b 24160 1726853547.95756: variable 'ansible_search_path' from source: unknown 24160 1726853547.95759: variable 'ansible_search_path' from source: unknown 24160 1726853547.95761: calling self._execute() 24160 1726853547.95844: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853547.95848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853547.95861: variable 'omit' from source: magic vars 24160 1726853547.96216: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.96227: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853547.96313: variable 'connection_failed' from source: set_fact 24160 1726853547.96320: Evaluated conditional (not connection_failed): True 24160 1726853547.96417: variable 'ansible_distribution_major_version' from source: facts 24160 1726853547.96425: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 
1726853547.96514: variable 'connection_failed' from source: set_fact 24160 1726853547.96517: Evaluated conditional (not connection_failed): True 24160 1726853547.96732: variable 'network_state' from source: role '' defaults 24160 1726853547.96735: Evaluated conditional (network_state != {}): False 24160 1726853547.96737: when evaluation is False, skipping this task 24160 1726853547.96738: _execute() done 24160 1726853547.96740: dumping result to json 24160 1726853547.96742: done dumping result, returning 24160 1726853547.96744: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5676-4eb4-00000000007b] 24160 1726853547.96745: sending task result for task 02083763-bbaf-5676-4eb4-00000000007b skipping: [managed_node1] => { "false_condition": "network_state != {}" } 24160 1726853547.96865: no more pending results, returning what we have 24160 1726853547.96869: results queue empty 24160 1726853547.96869: checking for any_errors_fatal 24160 1726853547.96880: done checking for any_errors_fatal 24160 1726853547.96880: checking for max_fail_percentage 24160 1726853547.96882: done checking for max_fail_percentage 24160 1726853547.96883: checking to see if all hosts have failed and the running result is not ok 24160 1726853547.96884: done checking to see if all hosts have failed 24160 1726853547.96884: getting the remaining hosts for this loop 24160 1726853547.96886: done getting the remaining hosts for this loop 24160 1726853547.96889: getting the next task for host managed_node1 24160 1726853547.96893: done getting next task for host managed_node1 24160 1726853547.96899: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24160 1726853547.96901: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 24160 1726853547.96913: getting variables 24160 1726853547.96915: in VariableManager get_vars() 24160 1726853547.96947: Calling all_inventory to load vars for managed_node1 24160 1726853547.96950: Calling groups_inventory to load vars for managed_node1 24160 1726853547.96952: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853547.96961: Calling all_plugins_play to load vars for managed_node1 24160 1726853547.96964: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853547.96967: Calling groups_plugins_play to load vars for managed_node1 24160 1726853547.97603: done sending task result for task 02083763-bbaf-5676-4eb4-00000000007b 24160 1726853547.97607: WORKER PROCESS EXITING 24160 1726853548.01603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.02518: done with get_vars() 24160 1726853548.02535: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:32:28 -0400 (0:00:00.076) 0:00:24.428 ****** 24160 1726853548.02592: entering _queue_task() for managed_node1/ping 24160 1726853548.02851: worker is 1 (out of 1 available) 24160 1726853548.02867: exiting _queue_task() for managed_node1/ping 24160 1726853548.02881: done queuing things up, now waiting for results queue to drain 24160 1726853548.02882: waiting for pending results... 
24160 1726853548.03060: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 24160 1726853548.03137: in run() - task 02083763-bbaf-5676-4eb4-00000000007c 24160 1726853548.03149: variable 'ansible_search_path' from source: unknown 24160 1726853548.03152: variable 'ansible_search_path' from source: unknown 24160 1726853548.03182: calling self._execute() 24160 1726853548.03260: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853548.03263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853548.03272: variable 'omit' from source: magic vars 24160 1726853548.03558: variable 'ansible_distribution_major_version' from source: facts 24160 1726853548.03566: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853548.03639: variable 'connection_failed' from source: set_fact 24160 1726853548.03643: Evaluated conditional (not connection_failed): True 24160 1726853548.03718: variable 'ansible_distribution_major_version' from source: facts 24160 1726853548.03721: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853548.03790: variable 'connection_failed' from source: set_fact 24160 1726853548.03794: Evaluated conditional (not connection_failed): True 24160 1726853548.03800: variable 'omit' from source: magic vars 24160 1726853548.03828: variable 'omit' from source: magic vars 24160 1726853548.03851: variable 'omit' from source: magic vars 24160 1726853548.03887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853548.03912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853548.03928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853548.03940: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853548.03949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853548.03974: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853548.03977: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853548.03990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853548.04049: Set connection var ansible_shell_executable to /bin/sh 24160 1726853548.04056: Set connection var ansible_pipelining to False 24160 1726853548.04060: Set connection var ansible_connection to ssh 24160 1726853548.04062: Set connection var ansible_shell_type to sh 24160 1726853548.04065: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853548.04075: Set connection var ansible_timeout to 10 24160 1726853548.04099: variable 'ansible_shell_executable' from source: unknown 24160 1726853548.04102: variable 'ansible_connection' from source: unknown 24160 1726853548.04105: variable 'ansible_module_compression' from source: unknown 24160 1726853548.04107: variable 'ansible_shell_type' from source: unknown 24160 1726853548.04110: variable 'ansible_shell_executable' from source: unknown 24160 1726853548.04112: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853548.04114: variable 'ansible_pipelining' from source: unknown 24160 1726853548.04116: variable 'ansible_timeout' from source: unknown 24160 1726853548.04119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853548.04260: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853548.04269: variable 'omit' from source: magic vars 24160 1726853548.04275: starting attempt loop 24160 1726853548.04277: running the handler 24160 1726853548.04290: _low_level_execute_command(): starting 24160 1726853548.04296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853548.04815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853548.04819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853548.04822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853548.04825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.04873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.04877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.04934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.06625: stdout 
chunk (state=3): >>>/root <<< 24160 1726853548.06726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.06977: stderr chunk (state=3): >>><<< 24160 1726853548.06981: stdout chunk (state=3): >>><<< 24160 1726853548.06987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.06990: _low_level_execute_command(): starting 24160 1726853548.06992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559 `" && echo ansible-tmp-1726853548.067943-25341-118078347473559="` echo /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559 `" ) && sleep 0' 24160 1726853548.07500: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853548.07511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853548.07527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853548.07532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.07552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853548.07557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853548.07581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853548.07617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853548.07620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.07711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.07717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.07852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.09694: stdout chunk (state=3): >>>ansible-tmp-1726853548.067943-25341-118078347473559=/root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559 <<< 24160 1726853548.09845: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 24160 1726853548.09863: stdout chunk (state=3): >>><<< 24160 1726853548.09880: stderr chunk (state=3): >>><<< 24160 1726853548.10079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853548.067943-25341-118078347473559=/root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.10083: variable 'ansible_module_compression' from source: unknown 24160 1726853548.10085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24160 1726853548.10087: variable 'ansible_facts' from source: unknown 24160 1726853548.10145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py 24160 1726853548.10257: Sending initial data 
24160 1726853548.10260: Sent initial data (152 bytes) 24160 1726853548.10919: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.11092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.11174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.11199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.12706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 24160 1726853548.12713: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853548.12743: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853548.12790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmps9bvgrgs /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py <<< 24160 1726853548.12793: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py" <<< 24160 1726853548.12825: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmps9bvgrgs" to remote "/root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py" <<< 24160 1726853548.12830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py" <<< 24160 1726853548.13363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.13370: stderr chunk (state=3): >>><<< 24160 1726853548.13375: stdout chunk (state=3): >>><<< 24160 1726853548.13416: done transferring module to remote 24160 1726853548.13426: _low_level_execute_command(): starting 24160 1726853548.13431: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/ /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py && sleep 0' 24160 1726853548.13848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 24160 1726853548.13855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853548.13877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.13884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853548.13899: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853548.13902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.13954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.13958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.13961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.13995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.15864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.15867: stdout chunk (state=3): >>><<< 24160 1726853548.15869: stderr chunk (state=3): >>><<< 24160 1726853548.15874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.15877: _low_level_execute_command(): starting 24160 1726853548.15879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/AnsiballZ_ping.py && sleep 0' 24160 1726853548.16585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.16676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.16689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.16833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.16887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.31877: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24160 1726853548.33194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.33321: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 24160 1726853548.33325: stdout chunk (state=3): >>><<< 24160 1726853548.33375: stderr chunk (state=3): >>><<< 24160 1726853548.33379: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853548.33382: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853548.33395: _low_level_execute_command(): starting 24160 1726853548.33403: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853548.067943-25341-118078347473559/ > /dev/null 2>&1 && sleep 0' 24160 1726853548.34443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853548.34474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853548.34677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853548.34682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.34726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.34747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.34767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.34834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.36877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.36882: stdout chunk (state=3): >>><<< 24160 1726853548.36885: stderr chunk (state=3): >>><<< 24160 1726853548.36888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.36891: handler run complete 24160 1726853548.36893: attempt loop complete, returning result 24160 1726853548.36895: _execute() done 24160 1726853548.36898: dumping result to json 24160 1726853548.36900: done dumping result, returning 24160 1726853548.36902: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5676-4eb4-00000000007c] 24160 1726853548.36906: sending task result for task 02083763-bbaf-5676-4eb4-00000000007c 24160 1726853548.36977: done sending task result for task 02083763-bbaf-5676-4eb4-00000000007c 24160 1726853548.36981: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 24160 1726853548.37048: no more pending results, returning what we have 24160 1726853548.37052: results queue empty 24160 1726853548.37053: checking for any_errors_fatal 24160 1726853548.37066: done checking for any_errors_fatal 24160 1726853548.37067: checking for max_fail_percentage 24160 1726853548.37069: done checking for max_fail_percentage 24160 1726853548.37070: checking to see if all hosts have failed and the running result is not ok 24160 1726853548.37073: done checking to see if all hosts have failed 24160 1726853548.37073: getting the remaining hosts for this loop 24160 1726853548.37075: done getting the remaining hosts for this loop 24160 1726853548.37079: getting the next task for host managed_node1 24160 1726853548.37087: done getting next task for host managed_node1 24160 
1726853548.37094: ^ task is: TASK: meta (role_complete) 24160 1726853548.37097: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853548.37108: getting variables 24160 1726853548.37111: in VariableManager get_vars() 24160 1726853548.37150: Calling all_inventory to load vars for managed_node1 24160 1726853548.37152: Calling groups_inventory to load vars for managed_node1 24160 1726853548.37157: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853548.37168: Calling all_plugins_play to load vars for managed_node1 24160 1726853548.37550: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853548.37558: Calling groups_plugins_play to load vars for managed_node1 24160 1726853548.39250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.40317: done with get_vars() 24160 1726853548.40335: done getting variables 24160 1726853548.40402: done queuing things up, now waiting for results queue to drain 24160 1726853548.40404: results queue empty 24160 1726853548.40404: checking for any_errors_fatal 24160 1726853548.40406: done checking for any_errors_fatal 24160 1726853548.40407: checking for max_fail_percentage 24160 1726853548.40407: done checking for max_fail_percentage 24160 1726853548.40408: checking to see if all hosts have failed and the running result is not ok 24160 1726853548.40408: done checking to see if all hosts have failed 24160 1726853548.40409: getting the remaining hosts for this loop 24160 1726853548.40409: done getting the remaining hosts for this loop 24160 1726853548.40411: getting the next task for host managed_node1 24160 1726853548.40414: done getting next task for host managed_node1 
24160 1726853548.40415: ^ task is: TASK: meta (flush_handlers) 24160 1726853548.40416: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853548.40418: getting variables 24160 1726853548.40419: in VariableManager get_vars() 24160 1726853548.40427: Calling all_inventory to load vars for managed_node1 24160 1726853548.40429: Calling groups_inventory to load vars for managed_node1 24160 1726853548.40430: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853548.40433: Calling all_plugins_play to load vars for managed_node1 24160 1726853548.40434: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853548.40436: Calling groups_plugins_play to load vars for managed_node1 24160 1726853548.41266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.42560: done with get_vars() 24160 1726853548.42577: done getting variables 24160 1726853548.42626: in VariableManager get_vars() 24160 1726853548.42638: Calling all_inventory to load vars for managed_node1 24160 1726853548.42640: Calling groups_inventory to load vars for managed_node1 24160 1726853548.42642: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853548.42646: Calling all_plugins_play to load vars for managed_node1 24160 1726853548.42648: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853548.42651: Calling groups_plugins_play to load vars for managed_node1 24160 1726853548.43710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.44651: done with get_vars() 24160 1726853548.44668: done queuing things up, now waiting for results queue to 
drain 24160 1726853548.44670: results queue empty 24160 1726853548.44672: checking for any_errors_fatal 24160 1726853548.44673: done checking for any_errors_fatal 24160 1726853548.44673: checking for max_fail_percentage 24160 1726853548.44674: done checking for max_fail_percentage 24160 1726853548.44675: checking to see if all hosts have failed and the running result is not ok 24160 1726853548.44675: done checking to see if all hosts have failed 24160 1726853548.44675: getting the remaining hosts for this loop 24160 1726853548.44676: done getting the remaining hosts for this loop 24160 1726853548.44678: getting the next task for host managed_node1 24160 1726853548.44681: done getting next task for host managed_node1 24160 1726853548.44684: ^ task is: TASK: meta (flush_handlers) 24160 1726853548.44686: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853548.44687: getting variables 24160 1726853548.44688: in VariableManager get_vars() 24160 1726853548.44695: Calling all_inventory to load vars for managed_node1 24160 1726853548.44697: Calling groups_inventory to load vars for managed_node1 24160 1726853548.44698: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853548.44701: Calling all_plugins_play to load vars for managed_node1 24160 1726853548.44703: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853548.44705: Calling groups_plugins_play to load vars for managed_node1 24160 1726853548.45714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.46743: done with get_vars() 24160 1726853548.46757: done getting variables 24160 1726853548.46797: in VariableManager get_vars() 24160 1726853548.46805: Calling all_inventory to load vars for managed_node1 24160 1726853548.46806: Calling groups_inventory to load vars for managed_node1 24160 1726853548.46808: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853548.46811: Calling all_plugins_play to load vars for managed_node1 24160 1726853548.46812: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853548.46814: Calling groups_plugins_play to load vars for managed_node1 24160 1726853548.48298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.49843: done with get_vars() 24160 1726853548.49861: done queuing things up, now waiting for results queue to drain 24160 1726853548.49862: results queue empty 24160 1726853548.49863: checking for any_errors_fatal 24160 1726853548.49864: done checking for any_errors_fatal 24160 1726853548.49864: checking for max_fail_percentage 24160 1726853548.49865: done checking for max_fail_percentage 24160 1726853548.49865: checking to see if all hosts have failed and the running result is not 
ok 24160 1726853548.49866: done checking to see if all hosts have failed 24160 1726853548.49866: getting the remaining hosts for this loop 24160 1726853548.49867: done getting the remaining hosts for this loop 24160 1726853548.49868: getting the next task for host managed_node1 24160 1726853548.49873: done getting next task for host managed_node1 24160 1726853548.49874: ^ task is: None 24160 1726853548.49875: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853548.49876: done queuing things up, now waiting for results queue to drain 24160 1726853548.49876: results queue empty 24160 1726853548.49877: checking for any_errors_fatal 24160 1726853548.49877: done checking for any_errors_fatal 24160 1726853548.49878: checking for max_fail_percentage 24160 1726853548.49878: done checking for max_fail_percentage 24160 1726853548.49879: checking to see if all hosts have failed and the running result is not ok 24160 1726853548.49880: done checking to see if all hosts have failed 24160 1726853548.49881: getting the next task for host managed_node1 24160 1726853548.49883: done getting next task for host managed_node1 24160 1726853548.49883: ^ task is: None 24160 1726853548.49884: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853548.49918: in VariableManager get_vars() 24160 1726853548.49933: done with get_vars() 24160 1726853548.49936: in VariableManager get_vars() 24160 1726853548.49944: done with get_vars() 24160 1726853548.49947: variable 'omit' from source: magic vars 24160 1726853548.50034: variable 'profile' from source: play vars 24160 1726853548.50107: in VariableManager get_vars() 24160 1726853548.50116: done with get_vars() 24160 1726853548.50130: variable 'omit' from source: magic vars 24160 1726853548.50175: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 24160 1726853548.50609: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24160 1726853548.50628: getting the remaining hosts for this loop 24160 1726853548.50629: done getting the remaining hosts for this loop 24160 1726853548.50632: getting the next task for host managed_node1 24160 1726853548.50634: done getting next task for host managed_node1 24160 1726853548.50637: ^ task is: TASK: Gathering Facts 24160 1726853548.50638: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853548.50639: getting variables 24160 1726853548.50640: in VariableManager get_vars() 24160 1726853548.50647: Calling all_inventory to load vars for managed_node1 24160 1726853548.50649: Calling groups_inventory to load vars for managed_node1 24160 1726853548.50650: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853548.50654: Calling all_plugins_play to load vars for managed_node1 24160 1726853548.50656: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853548.50658: Calling groups_plugins_play to load vars for managed_node1 24160 1726853548.51323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853548.52329: done with get_vars() 24160 1726853548.52348: done getting variables 24160 1726853548.52393: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 13:32:28 -0400 (0:00:00.498) 0:00:24.926 ****** 24160 1726853548.52418: entering _queue_task() for managed_node1/gather_facts 24160 1726853548.52750: worker is 1 (out of 1 available) 24160 1726853548.52760: exiting _queue_task() for managed_node1/gather_facts 24160 1726853548.52772: done queuing things up, now waiting for results queue to drain 24160 1726853548.52774: waiting for pending results... 
24160 1726853548.53188: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24160 1726853548.53193: in run() - task 02083763-bbaf-5676-4eb4-000000000521 24160 1726853548.53196: variable 'ansible_search_path' from source: unknown 24160 1726853548.53199: calling self._execute() 24160 1726853548.53297: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853548.53310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853548.53323: variable 'omit' from source: magic vars 24160 1726853548.53750: variable 'ansible_distribution_major_version' from source: facts 24160 1726853548.53753: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853548.53756: variable 'omit' from source: magic vars 24160 1726853548.53758: variable 'omit' from source: magic vars 24160 1726853548.53783: variable 'omit' from source: magic vars 24160 1726853548.53825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853548.53868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853548.53897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853548.53920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853548.53936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853548.53973: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853548.53982: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853548.53988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853548.54086: Set connection var ansible_shell_executable to /bin/sh 24160 1726853548.54186: Set 
connection var ansible_pipelining to False 24160 1726853548.54190: Set connection var ansible_connection to ssh 24160 1726853548.54192: Set connection var ansible_shell_type to sh 24160 1726853548.54194: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853548.54197: Set connection var ansible_timeout to 10 24160 1726853548.54199: variable 'ansible_shell_executable' from source: unknown 24160 1726853548.54201: variable 'ansible_connection' from source: unknown 24160 1726853548.54203: variable 'ansible_module_compression' from source: unknown 24160 1726853548.54205: variable 'ansible_shell_type' from source: unknown 24160 1726853548.54208: variable 'ansible_shell_executable' from source: unknown 24160 1726853548.54210: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853548.54212: variable 'ansible_pipelining' from source: unknown 24160 1726853548.54214: variable 'ansible_timeout' from source: unknown 24160 1726853548.54216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853548.54404: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853548.54420: variable 'omit' from source: magic vars 24160 1726853548.54431: starting attempt loop 24160 1726853548.54438: running the handler 24160 1726853548.54457: variable 'ansible_facts' from source: unknown 24160 1726853548.54485: _low_level_execute_command(): starting 24160 1726853548.54499: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853548.55231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853548.55245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 
1726853548.55268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853548.55378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.55429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.55483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.57147: stdout chunk (state=3): >>>/root <<< 24160 1726853548.57277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.57280: stdout chunk (state=3): >>><<< 24160 1726853548.57283: stderr chunk (state=3): >>><<< 24160 1726853548.57298: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.57314: _low_level_execute_command(): starting 24160 1726853548.57318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741 `" && echo ansible-tmp-1726853548.572992-25366-240732860458741="` echo /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741 `" ) && sleep 0' 24160 1726853548.57741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853548.57744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853548.57747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853548.57756: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853548.57758: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.57802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.57810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.57850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.59727: stdout chunk (state=3): >>>ansible-tmp-1726853548.572992-25366-240732860458741=/root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741 <<< 24160 1726853548.59880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.59883: stdout chunk (state=3): >>><<< 24160 1726853548.59885: stderr chunk (state=3): >>><<< 24160 1726853548.60077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853548.572992-25366-240732860458741=/root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.60081: variable 'ansible_module_compression' from source: unknown 24160 1726853548.60083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24160 1726853548.60085: variable 'ansible_facts' from source: unknown 24160 1726853548.60221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py 24160 1726853548.60332: Sending initial data 24160 1726853548.60344: Sent initial data (153 bytes) 24160 1726853548.60761: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853548.60764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.60767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853548.60769: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.60822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.60829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.60869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.62410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24160 1726853548.62413: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853548.62444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853548.62484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpwxq0q4fj /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py <<< 24160 1726853548.62489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py" <<< 24160 1726853548.62520: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpwxq0q4fj" to remote "/root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py" <<< 24160 1726853548.63760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.63763: stdout chunk (state=3): >>><<< 24160 1726853548.63766: stderr chunk (state=3): >>><<< 24160 1726853548.63768: done transferring module to remote 24160 1726853548.63770: _low_level_execute_command(): starting 24160 1726853548.63774: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/ /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py && sleep 0' 24160 1726853548.64348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.64389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853548.66134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853548.66184: stderr chunk (state=3): >>><<< 24160 1726853548.66215: stdout chunk (state=3): >>><<< 24160 1726853548.66306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853548.66310: _low_level_execute_command(): starting 24160 1726853548.66312: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/AnsiballZ_setup.py && sleep 0' 24160 1726853548.66867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853548.66885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853548.66947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853548.67010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853548.67027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853548.67070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853548.67151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853549.32377: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": 
"/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "28", "epoch": "1726853548", "epoch_int": "1726853548", "date": "2024-09-20", "time": "13:32:28", "iso8601_micro": "2024-09-20T17:32:28.940458Z", "iso8601": "2024-09-20T17:32:28Z", "iso8601_basic": "20240920T133228940458", "iso8601_basic_short": "20240920T133228", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_distribution":<<< 24160 1726853549.32409: stdout chunk (state=3): >>> "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.48974609375, "5m": 0.36474609375, "15m": 0.19677734375}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "peerethtest0", "eth0", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "46:cb:a9:60:52:30", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44cb:a9ff:fe60:5230", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", 
"tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ba:78:39:74:8a:f6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b878:39ff:fe74:8af6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", 
"fe80::b878:39ff:fe74:8af6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"]}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": 
[]}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 715, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794672640, "block_size": 4096, "block_total": 65519099, "block_available": 63914715, "block_used": 1604384, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24160 1726853549.34380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853549.34405: stdout chunk (state=3): >>><<< 24160 1726853549.34409: stderr chunk (state=3): >>><<< 24160 1726853549.34475: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "28", "epoch": "1726853548", "epoch_int": "1726853548", "date": "2024-09-20", "time": "13:32:28", "iso8601_micro": "2024-09-20T17:32:28.940458Z", "iso8601": "2024-09-20T17:32:28Z", "iso8601_basic": "20240920T133228940458", "iso8601_basic_short": "20240920T133228", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.48974609375, "5m": 0.36474609375, "15m": 0.19677734375}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "peerethtest0", "eth0", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, 
"ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "46:cb:a9:60:52:30", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44cb:a9ff:fe60:5230", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ba:78:39:74:8a:f6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b878:39ff:fe74:8af6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", 
"macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"]}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": 
{"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 715, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794672640, "block_size": 4096, "block_total": 65519099, "block_available": 63914715, "block_used": 1604384, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853549.35178: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853549.35181: _low_level_execute_command(): starting 24160 1726853549.35184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853548.572992-25366-240732860458741/ > /dev/null 2>&1 && sleep 0' 24160 1726853549.36361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853549.36385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853549.36402: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 24160 1726853549.36475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.36531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853549.36568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853549.36659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853549.38541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853549.38545: stdout chunk (state=3): >>><<< 24160 1726853549.38547: stderr chunk (state=3): >>><<< 24160 1726853549.38777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853549.38781: handler run complete 24160 1726853549.38783: variable 'ansible_facts' from source: unknown 24160 1726853549.38881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.39288: variable 'ansible_facts' from source: unknown 24160 1726853549.39397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.39589: attempt loop complete, returning result 24160 1726853549.39599: _execute() done 24160 1726853549.39607: dumping result to json 24160 1726853549.39648: done dumping result, returning 24160 1726853549.39675: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-5676-4eb4-000000000521] 24160 1726853549.39684: sending task result for task 02083763-bbaf-5676-4eb4-000000000521 ok: [managed_node1] 24160 1726853549.42056: no more pending results, returning what we have 24160 1726853549.42059: results queue empty 24160 1726853549.42060: checking for any_errors_fatal 24160 1726853549.42061: done checking for any_errors_fatal 24160 1726853549.42062: checking for max_fail_percentage 24160 1726853549.42063: done checking for max_fail_percentage 24160 1726853549.42064: checking to see if all hosts have failed and 
the running result is not ok 24160 1726853549.42065: done checking to see if all hosts have failed 24160 1726853549.42066: getting the remaining hosts for this loop 24160 1726853549.42067: done getting the remaining hosts for this loop 24160 1726853549.42073: getting the next task for host managed_node1 24160 1726853549.42078: done getting next task for host managed_node1 24160 1726853549.42080: ^ task is: TASK: meta (flush_handlers) 24160 1726853549.42088: done sending task result for task 02083763-bbaf-5676-4eb4-000000000521 24160 1726853549.42091: WORKER PROCESS EXITING 24160 1726853549.42092: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853549.42127: getting variables 24160 1726853549.42129: in VariableManager get_vars() 24160 1726853549.42158: Calling all_inventory to load vars for managed_node1 24160 1726853549.42161: Calling groups_inventory to load vars for managed_node1 24160 1726853549.42163: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.42176: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.42179: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.42182: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.43620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.45344: done with get_vars() 24160 1726853549.45369: done getting variables 24160 1726853549.45443: in VariableManager get_vars() 24160 1726853549.45459: Calling all_inventory to load vars for managed_node1 24160 1726853549.45462: Calling groups_inventory to load vars for managed_node1 24160 1726853549.45464: Calling all_plugins_inventory to load vars for 
managed_node1 24160 1726853549.45468: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.45472: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.45477: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.46344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.47305: done with get_vars() 24160 1726853549.47321: done queuing things up, now waiting for results queue to drain 24160 1726853549.47323: results queue empty 24160 1726853549.47323: checking for any_errors_fatal 24160 1726853549.47326: done checking for any_errors_fatal 24160 1726853549.47326: checking for max_fail_percentage 24160 1726853549.47327: done checking for max_fail_percentage 24160 1726853549.47331: checking to see if all hosts have failed and the running result is not ok 24160 1726853549.47332: done checking to see if all hosts have failed 24160 1726853549.47332: getting the remaining hosts for this loop 24160 1726853549.47333: done getting the remaining hosts for this loop 24160 1726853549.47335: getting the next task for host managed_node1 24160 1726853549.47338: done getting next task for host managed_node1 24160 1726853549.47340: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24160 1726853549.47342: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853549.47349: getting variables 24160 1726853549.47350: in VariableManager get_vars() 24160 1726853549.47362: Calling all_inventory to load vars for managed_node1 24160 1726853549.47363: Calling groups_inventory to load vars for managed_node1 24160 1726853549.47364: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.47368: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.47369: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.47372: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.48005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.49122: done with get_vars() 24160 1726853549.49141: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:32:29 -0400 (0:00:00.968) 0:00:25.894 ****** 24160 1726853549.49225: entering _queue_task() for managed_node1/include_tasks 24160 1726853549.49601: worker is 1 (out of 1 available) 24160 1726853549.49619: exiting _queue_task() for managed_node1/include_tasks 24160 1726853549.49630: done queuing things up, now waiting for results queue to drain 24160 1726853549.49632: waiting for pending results... 
24160 1726853549.49813: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24160 1726853549.49886: in run() - task 02083763-bbaf-5676-4eb4-000000000084 24160 1726853549.49903: variable 'ansible_search_path' from source: unknown 24160 1726853549.49906: variable 'ansible_search_path' from source: unknown 24160 1726853549.49933: calling self._execute() 24160 1726853549.50024: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.50029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853549.50037: variable 'omit' from source: magic vars 24160 1726853549.50332: variable 'ansible_distribution_major_version' from source: facts 24160 1726853549.50343: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853549.50348: _execute() done 24160 1726853549.50351: dumping result to json 24160 1726853549.50357: done dumping result, returning 24160 1726853549.50361: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-5676-4eb4-000000000084] 24160 1726853549.50367: sending task result for task 02083763-bbaf-5676-4eb4-000000000084 24160 1726853549.50460: done sending task result for task 02083763-bbaf-5676-4eb4-000000000084 24160 1726853549.50462: WORKER PROCESS EXITING 24160 1726853549.50506: no more pending results, returning what we have 24160 1726853549.50511: in VariableManager get_vars() 24160 1726853549.50550: Calling all_inventory to load vars for managed_node1 24160 1726853549.50556: Calling groups_inventory to load vars for managed_node1 24160 1726853549.50558: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.50569: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.50573: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.50576: Calling 
groups_plugins_play to load vars for managed_node1 24160 1726853549.51431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.52310: done with get_vars() 24160 1726853549.52323: variable 'ansible_search_path' from source: unknown 24160 1726853549.52324: variable 'ansible_search_path' from source: unknown 24160 1726853549.52343: we have included files to process 24160 1726853549.52344: generating all_blocks data 24160 1726853549.52345: done generating all_blocks data 24160 1726853549.52346: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24160 1726853549.52346: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24160 1726853549.52348: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24160 1726853549.52783: done processing included file 24160 1726853549.52785: iterating over new_blocks loaded from include file 24160 1726853549.52786: in VariableManager get_vars() 24160 1726853549.52806: done with get_vars() 24160 1726853549.52808: filtering new block on tags 24160 1726853549.52822: done filtering new block on tags 24160 1726853549.52825: in VariableManager get_vars() 24160 1726853549.52843: done with get_vars() 24160 1726853549.52845: filtering new block on tags 24160 1726853549.52863: done filtering new block on tags 24160 1726853549.52865: in VariableManager get_vars() 24160 1726853549.52885: done with get_vars() 24160 1726853549.52886: filtering new block on tags 24160 1726853549.52901: done filtering new block on tags 24160 1726853549.52903: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 24160 1726853549.52908: extending task lists for 
all hosts with included blocks 24160 1726853549.53263: done extending task lists 24160 1726853549.53264: done processing included files 24160 1726853549.53265: results queue empty 24160 1726853549.53265: checking for any_errors_fatal 24160 1726853549.53267: done checking for any_errors_fatal 24160 1726853549.53268: checking for max_fail_percentage 24160 1726853549.53270: done checking for max_fail_percentage 24160 1726853549.53274: checking to see if all hosts have failed and the running result is not ok 24160 1726853549.53274: done checking to see if all hosts have failed 24160 1726853549.53275: getting the remaining hosts for this loop 24160 1726853549.53276: done getting the remaining hosts for this loop 24160 1726853549.53279: getting the next task for host managed_node1 24160 1726853549.53283: done getting next task for host managed_node1 24160 1726853549.53286: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24160 1726853549.53289: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853549.53297: getting variables 24160 1726853549.53298: in VariableManager get_vars() 24160 1726853549.53311: Calling all_inventory to load vars for managed_node1 24160 1726853549.53313: Calling groups_inventory to load vars for managed_node1 24160 1726853549.53315: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.53320: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.53322: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.53325: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.54419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.55302: done with get_vars() 24160 1726853549.55316: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:32:29 -0400 (0:00:00.061) 0:00:25.956 ****** 24160 1726853549.55368: entering _queue_task() for managed_node1/setup 24160 1726853549.55624: worker is 1 (out of 1 available) 24160 1726853549.55637: exiting _queue_task() for managed_node1/setup 24160 1726853549.55650: done queuing things up, now waiting for results queue to drain 24160 1726853549.55651: waiting for pending results... 
24160 1726853549.55828: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24160 1726853549.55918: in run() - task 02083763-bbaf-5676-4eb4-000000000562 24160 1726853549.55930: variable 'ansible_search_path' from source: unknown 24160 1726853549.55933: variable 'ansible_search_path' from source: unknown 24160 1726853549.55963: calling self._execute() 24160 1726853549.56037: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.56041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853549.56050: variable 'omit' from source: magic vars 24160 1726853549.56322: variable 'ansible_distribution_major_version' from source: facts 24160 1726853549.56334: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853549.56486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853549.58523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853549.58528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853549.58558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853549.58586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853549.58619: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853549.58698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853549.58713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853549.58731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853549.58758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853549.58773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853549.58976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853549.58979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853549.58982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853549.58985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853549.58986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853549.59086: variable '__network_required_facts' from source: role 
'' defaults 24160 1726853549.59101: variable 'ansible_facts' from source: unknown 24160 1726853549.59788: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24160 1726853549.59797: when evaluation is False, skipping this task 24160 1726853549.59805: _execute() done 24160 1726853549.59813: dumping result to json 24160 1726853549.59820: done dumping result, returning 24160 1726853549.59833: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-5676-4eb4-000000000562] 24160 1726853549.59848: sending task result for task 02083763-bbaf-5676-4eb4-000000000562 24160 1726853549.59950: done sending task result for task 02083763-bbaf-5676-4eb4-000000000562 24160 1726853549.59954: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853549.59996: no more pending results, returning what we have 24160 1726853549.60001: results queue empty 24160 1726853549.60002: checking for any_errors_fatal 24160 1726853549.60003: done checking for any_errors_fatal 24160 1726853549.60004: checking for max_fail_percentage 24160 1726853549.60005: done checking for max_fail_percentage 24160 1726853549.60006: checking to see if all hosts have failed and the running result is not ok 24160 1726853549.60006: done checking to see if all hosts have failed 24160 1726853549.60007: getting the remaining hosts for this loop 24160 1726853549.60008: done getting the remaining hosts for this loop 24160 1726853549.60012: getting the next task for host managed_node1 24160 1726853549.60019: done getting next task for host managed_node1 24160 1726853549.60030: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24160 1726853549.60034: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853549.60048: getting variables 24160 1726853549.60049: in VariableManager get_vars() 24160 1726853549.60089: Calling all_inventory to load vars for managed_node1 24160 1726853549.60091: Calling groups_inventory to load vars for managed_node1 24160 1726853549.60094: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.60103: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.60105: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.60108: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.60959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.61843: done with get_vars() 24160 1726853549.61858: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:32:29 -0400 (0:00:00.065) 0:00:26.021 ****** 24160 1726853549.61924: entering _queue_task() for managed_node1/stat 24160 1726853549.62129: worker is 1 (out of 1 available) 24160 1726853549.62143: exiting _queue_task() for managed_node1/stat 24160 1726853549.62156: done queuing things up, now waiting for results queue to drain 24160 1726853549.62157: waiting for pending results... 
24160 1726853549.62340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 24160 1726853549.62430: in run() - task 02083763-bbaf-5676-4eb4-000000000564 24160 1726853549.62441: variable 'ansible_search_path' from source: unknown 24160 1726853549.62444: variable 'ansible_search_path' from source: unknown 24160 1726853549.62475: calling self._execute() 24160 1726853549.62545: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.62549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853549.62559: variable 'omit' from source: magic vars 24160 1726853549.63079: variable 'ansible_distribution_major_version' from source: facts 24160 1726853549.63082: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853549.63140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853549.63409: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853549.63458: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853549.63499: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853549.63536: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853549.63657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853549.63661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853549.63682: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853549.63695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853549.63762: variable '__network_is_ostree' from source: set_fact 24160 1726853549.63768: Evaluated conditional (not __network_is_ostree is defined): False 24160 1726853549.63773: when evaluation is False, skipping this task 24160 1726853549.63776: _execute() done 24160 1726853549.63778: dumping result to json 24160 1726853549.63780: done dumping result, returning 24160 1726853549.63787: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-5676-4eb4-000000000564] 24160 1726853549.63792: sending task result for task 02083763-bbaf-5676-4eb4-000000000564 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24160 1726853549.63921: no more pending results, returning what we have 24160 1726853549.63925: results queue empty 24160 1726853549.63926: checking for any_errors_fatal 24160 1726853549.63932: done checking for any_errors_fatal 24160 1726853549.63933: checking for max_fail_percentage 24160 1726853549.63934: done checking for max_fail_percentage 24160 1726853549.63935: checking to see if all hosts have failed and the running result is not ok 24160 1726853549.63936: done checking to see if all hosts have failed 24160 1726853549.63937: getting the remaining hosts for this loop 24160 1726853549.63938: done getting the remaining hosts for this loop 24160 1726853549.63941: getting the next task for host managed_node1 24160 1726853549.63946: done getting next task for host managed_node1 24160 
1726853549.63952: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24160 1726853549.63954: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853549.63967: getting variables 24160 1726853549.63968: in VariableManager get_vars() 24160 1726853549.64001: Calling all_inventory to load vars for managed_node1 24160 1726853549.64004: Calling groups_inventory to load vars for managed_node1 24160 1726853549.64006: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.64013: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.64016: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.64018: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.64800: done sending task result for task 02083763-bbaf-5676-4eb4-000000000564 24160 1726853549.64803: WORKER PROCESS EXITING 24160 1726853549.64814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.65795: done with get_vars() 24160 1726853549.65811: done getting variables 24160 1726853549.65850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:32:29 -0400 (0:00:00.039) 0:00:26.061 ****** 24160 1726853549.65877: entering _queue_task() for managed_node1/set_fact 24160 1726853549.66068: worker is 1 (out of 1 available) 24160 1726853549.66083: exiting _queue_task() for managed_node1/set_fact 24160 1726853549.66097: done queuing things up, now waiting for results queue to drain 24160 1726853549.66099: waiting for pending results... 24160 1726853549.66305: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24160 1726853549.66395: in run() - task 02083763-bbaf-5676-4eb4-000000000565 24160 1726853549.66406: variable 'ansible_search_path' from source: unknown 24160 1726853549.66409: variable 'ansible_search_path' from source: unknown 24160 1726853549.66437: calling self._execute() 24160 1726853549.66507: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.66510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853549.66519: variable 'omit' from source: magic vars 24160 1726853549.66784: variable 'ansible_distribution_major_version' from source: facts 24160 1726853549.66799: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853549.66909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853549.67102: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853549.67135: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853549.67177: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 
1726853549.67184: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853549.67247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853549.67266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853549.67285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853549.67303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853549.67369: variable '__network_is_ostree' from source: set_fact 24160 1726853549.67376: Evaluated conditional (not __network_is_ostree is defined): False 24160 1726853549.67379: when evaluation is False, skipping this task 24160 1726853549.67382: _execute() done 24160 1726853549.67384: dumping result to json 24160 1726853549.67389: done dumping result, returning 24160 1726853549.67395: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-5676-4eb4-000000000565] 24160 1726853549.67400: sending task result for task 02083763-bbaf-5676-4eb4-000000000565 24160 1726853549.67482: done sending task result for task 02083763-bbaf-5676-4eb4-000000000565 24160 1726853549.67485: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24160 1726853549.67557: no more pending results, returning what we 
have 24160 1726853549.67560: results queue empty 24160 1726853549.67561: checking for any_errors_fatal 24160 1726853549.67567: done checking for any_errors_fatal 24160 1726853549.67567: checking for max_fail_percentage 24160 1726853549.67569: done checking for max_fail_percentage 24160 1726853549.67570: checking to see if all hosts have failed and the running result is not ok 24160 1726853549.67573: done checking to see if all hosts have failed 24160 1726853549.67574: getting the remaining hosts for this loop 24160 1726853549.67575: done getting the remaining hosts for this loop 24160 1726853549.67578: getting the next task for host managed_node1 24160 1726853549.67585: done getting next task for host managed_node1 24160 1726853549.67587: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24160 1726853549.67590: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853549.67602: getting variables 24160 1726853549.67603: in VariableManager get_vars() 24160 1726853549.67633: Calling all_inventory to load vars for managed_node1 24160 1726853549.67635: Calling groups_inventory to load vars for managed_node1 24160 1726853549.67637: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853549.67645: Calling all_plugins_play to load vars for managed_node1 24160 1726853549.67647: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853549.67649: Calling groups_plugins_play to load vars for managed_node1 24160 1726853549.68405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853549.69283: done with get_vars() 24160 1726853549.69299: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:32:29 -0400 (0:00:00.034) 0:00:26.096 ****** 24160 1726853549.69364: entering _queue_task() for managed_node1/service_facts 24160 1726853549.69593: worker is 1 (out of 1 available) 24160 1726853549.69606: exiting _queue_task() for managed_node1/service_facts 24160 1726853549.69619: done queuing things up, now waiting for results queue to drain 24160 1726853549.69620: waiting for pending results... 
24160 1726853549.69806: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 24160 1726853549.69898: in run() - task 02083763-bbaf-5676-4eb4-000000000567 24160 1726853549.69911: variable 'ansible_search_path' from source: unknown 24160 1726853549.69914: variable 'ansible_search_path' from source: unknown 24160 1726853549.69942: calling self._execute() 24160 1726853549.70016: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.70020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853549.70029: variable 'omit' from source: magic vars 24160 1726853549.70310: variable 'ansible_distribution_major_version' from source: facts 24160 1726853549.70321: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853549.70326: variable 'omit' from source: magic vars 24160 1726853549.70357: variable 'omit' from source: magic vars 24160 1726853549.70386: variable 'omit' from source: magic vars 24160 1726853549.70420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853549.70446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853549.70464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853549.70478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853549.70489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853549.70515: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853549.70519: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.70522: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 24160 1726853549.70591: Set connection var ansible_shell_executable to /bin/sh 24160 1726853549.70595: Set connection var ansible_pipelining to False 24160 1726853549.70600: Set connection var ansible_connection to ssh 24160 1726853549.70602: Set connection var ansible_shell_type to sh 24160 1726853549.70616: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853549.70619: Set connection var ansible_timeout to 10 24160 1726853549.70634: variable 'ansible_shell_executable' from source: unknown 24160 1726853549.70637: variable 'ansible_connection' from source: unknown 24160 1726853549.70640: variable 'ansible_module_compression' from source: unknown 24160 1726853549.70643: variable 'ansible_shell_type' from source: unknown 24160 1726853549.70645: variable 'ansible_shell_executable' from source: unknown 24160 1726853549.70647: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853549.70651: variable 'ansible_pipelining' from source: unknown 24160 1726853549.70654: variable 'ansible_timeout' from source: unknown 24160 1726853549.70660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853549.70804: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853549.70811: variable 'omit' from source: magic vars 24160 1726853549.70816: starting attempt loop 24160 1726853549.70821: running the handler 24160 1726853549.70835: _low_level_execute_command(): starting 24160 1726853549.70841: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853549.71351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 24160 1726853549.71358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.71361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853549.71363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.71417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853549.71420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853549.71422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853549.71477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853549.73136: stdout chunk (state=3): >>>/root <<< 24160 1726853549.73240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853549.73267: stderr chunk (state=3): >>><<< 24160 1726853549.73272: stdout chunk (state=3): >>><<< 24160 1726853549.73291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853549.73303: _low_level_execute_command(): starting 24160 1726853549.73311: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792 `" && echo ansible-tmp-1726853549.7329009-25418-128876376283792="` echo /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792 `" ) && sleep 0' 24160 1726853549.73732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853549.73736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853549.73739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853549.73750: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853549.73753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.73797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853549.73803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853549.73806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853549.73844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853549.75705: stdout chunk (state=3): >>>ansible-tmp-1726853549.7329009-25418-128876376283792=/root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792 <<< 24160 1726853549.75810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853549.75834: stderr chunk (state=3): >>><<< 24160 1726853549.75837: stdout chunk (state=3): >>><<< 24160 1726853549.75849: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853549.7329009-25418-128876376283792=/root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853549.75885: variable 'ansible_module_compression' from source: unknown 24160 1726853549.75920: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24160 1726853549.75958: variable 'ansible_facts' from source: unknown 24160 1726853549.76008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py 24160 1726853549.76104: Sending initial data 24160 1726853549.76107: Sent initial data (162 bytes) 24160 1726853549.76534: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853549.76537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853549.76540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.76542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853549.76544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.76602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853549.76605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853549.76640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853549.78177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 24160 1726853549.78182: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853549.78212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853549.78253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp85fsnv4e /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py <<< 24160 1726853549.78257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py" <<< 24160 1726853549.78296: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp85fsnv4e" to remote "/root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py" <<< 24160 1726853549.78846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853549.78881: stderr chunk (state=3): >>><<< 24160 1726853549.78884: stdout chunk (state=3): >>><<< 24160 1726853549.78937: done transferring module to remote 24160 1726853549.78946: _low_level_execute_command(): starting 24160 1726853549.78957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/ /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py && sleep 0' 24160 1726853549.79373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853549.79376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853549.79380: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.79383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853549.79389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853549.79391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.79434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853549.79438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853549.79483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853549.81190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853549.81211: stderr chunk (state=3): >>><<< 24160 1726853549.81215: stdout chunk (state=3): >>><<< 24160 1726853549.81229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853549.81232: _low_level_execute_command(): starting 24160 1726853549.81235: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/AnsiballZ_service_facts.py && sleep 0' 24160 1726853549.81653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853549.81656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853549.81660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.81662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853549.81664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853549.81715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853549.81718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853549.81763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853551.34323: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": 
{"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "in<<< 24160 1726853551.34339: stdout chunk (state=3): >>>active", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24160 1726853551.35697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853551.35987: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 24160 1726853551.36019: stderr chunk (state=3): >>><<< 24160 1726853551.36027: stdout chunk (state=3): >>><<< 24160 1726853551.36101: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": 
"systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853551.38209: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853551.38229: _low_level_execute_command(): starting 24160 1726853551.38239: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853549.7329009-25418-128876376283792/ > /dev/null 2>&1 && sleep 0' 24160 1726853551.39589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853551.39724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853551.39743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853551.39891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853551.40025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853551.41886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853551.41896: stdout chunk (state=3): >>><<< 24160 1726853551.41906: stderr chunk (state=3): >>><<< 24160 1726853551.42178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 
1726853551.42182: handler run complete 24160 1726853551.42676: variable 'ansible_facts' from source: unknown 24160 1726853551.42680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853551.43620: variable 'ansible_facts' from source: unknown 24160 1726853551.43909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853551.44676: attempt loop complete, returning result 24160 1726853551.44679: _execute() done 24160 1726853551.44682: dumping result to json 24160 1726853551.44684: done dumping result, returning 24160 1726853551.44687: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-5676-4eb4-000000000567] 24160 1726853551.44689: sending task result for task 02083763-bbaf-5676-4eb4-000000000567 24160 1726853551.46520: done sending task result for task 02083763-bbaf-5676-4eb4-000000000567 24160 1726853551.46524: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853551.46629: no more pending results, returning what we have 24160 1726853551.46633: results queue empty 24160 1726853551.46634: checking for any_errors_fatal 24160 1726853551.46637: done checking for any_errors_fatal 24160 1726853551.46638: checking for max_fail_percentage 24160 1726853551.46639: done checking for max_fail_percentage 24160 1726853551.46640: checking to see if all hosts have failed and the running result is not ok 24160 1726853551.46641: done checking to see if all hosts have failed 24160 1726853551.46642: getting the remaining hosts for this loop 24160 1726853551.46643: done getting the remaining hosts for this loop 24160 1726853551.46646: getting the next task for host managed_node1 24160 1726853551.46652: done getting next task for host 
managed_node1 24160 1726853551.46655: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24160 1726853551.46657: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853551.46667: getting variables 24160 1726853551.46668: in VariableManager get_vars() 24160 1726853551.46698: Calling all_inventory to load vars for managed_node1 24160 1726853551.46700: Calling groups_inventory to load vars for managed_node1 24160 1726853551.46703: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853551.46711: Calling all_plugins_play to load vars for managed_node1 24160 1726853551.46713: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853551.46716: Calling groups_plugins_play to load vars for managed_node1 24160 1726853551.49329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853551.53040: done with get_vars() 24160 1726853551.53070: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:32:31 -0400 (0:00:01.838) 0:00:27.934 ****** 24160 1726853551.53173: entering _queue_task() for managed_node1/package_facts 24160 1726853551.53527: worker is 1 (out of 1 available) 24160 1726853551.53541: exiting _queue_task() for 
managed_node1/package_facts 24160 1726853551.53553: done queuing things up, now waiting for results queue to drain 24160 1726853551.53554: waiting for pending results... 24160 1726853551.53989: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 24160 1726853551.53995: in run() - task 02083763-bbaf-5676-4eb4-000000000568 24160 1726853551.53998: variable 'ansible_search_path' from source: unknown 24160 1726853551.54001: variable 'ansible_search_path' from source: unknown 24160 1726853551.54030: calling self._execute() 24160 1726853551.54134: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853551.54138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853551.54149: variable 'omit' from source: magic vars 24160 1726853551.54536: variable 'ansible_distribution_major_version' from source: facts 24160 1726853551.54550: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853551.54559: variable 'omit' from source: magic vars 24160 1726853551.54621: variable 'omit' from source: magic vars 24160 1726853551.54659: variable 'omit' from source: magic vars 24160 1726853551.54814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853551.54818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853551.54822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853551.54825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853551.54828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853551.54831: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 24160 1726853551.55189: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853551.55193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853551.55196: Set connection var ansible_shell_executable to /bin/sh 24160 1726853551.55198: Set connection var ansible_pipelining to False 24160 1726853551.55201: Set connection var ansible_connection to ssh 24160 1726853551.55203: Set connection var ansible_shell_type to sh 24160 1726853551.55205: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853551.55207: Set connection var ansible_timeout to 10 24160 1726853551.55209: variable 'ansible_shell_executable' from source: unknown 24160 1726853551.55211: variable 'ansible_connection' from source: unknown 24160 1726853551.55213: variable 'ansible_module_compression' from source: unknown 24160 1726853551.55215: variable 'ansible_shell_type' from source: unknown 24160 1726853551.55217: variable 'ansible_shell_executable' from source: unknown 24160 1726853551.55219: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853551.55221: variable 'ansible_pipelining' from source: unknown 24160 1726853551.55222: variable 'ansible_timeout' from source: unknown 24160 1726853551.55224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853551.55241: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853551.55244: variable 'omit' from source: magic vars 24160 1726853551.55251: starting attempt loop 24160 1726853551.55258: running the handler 24160 1726853551.55275: _low_level_execute_command(): starting 24160 1726853551.55282: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 
1726853551.56064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853551.56119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853551.56131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853551.56172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853551.56284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853551.57917: stdout chunk (state=3): >>>/root <<< 24160 1726853551.58020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853551.58074: stderr chunk (state=3): >>><<< 24160 1726853551.58079: stdout chunk (state=3): >>><<< 24160 1726853551.58102: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853551.58116: _low_level_execute_command(): starting 24160 1726853551.58124: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015 `" && echo ansible-tmp-1726853551.581024-25508-44578599168015="` echo /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015 `" ) && sleep 0' 24160 1726853551.58731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853551.58766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853551.58770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853551.58774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853551.58804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853551.58807: stderr chunk (state=3): >>>debug2: 
match not found <<< 24160 1726853551.58983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853551.58987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853551.58990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853551.60885: stdout chunk (state=3): >>>ansible-tmp-1726853551.581024-25508-44578599168015=/root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015 <<< 24160 1726853551.61177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853551.61180: stdout chunk (state=3): >>><<< 24160 1726853551.61182: stderr chunk (state=3): >>><<< 24160 1726853551.61185: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853551.581024-25508-44578599168015=/root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853551.61188: variable 'ansible_module_compression' from source: unknown 24160 1726853551.61190: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24160 1726853551.61199: variable 'ansible_facts' from source: unknown 24160 1726853551.61380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py 24160 1726853551.61588: Sending initial data 24160 1726853551.61592: Sent initial data (160 bytes) 24160 1726853551.62110: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853551.62120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853551.62131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853551.62191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853551.62238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853551.62289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853551.62390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853551.62459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853551.64077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853551.64141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853551.64185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpkan_xs13 /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py <<< 24160 1726853551.64188: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py" <<< 24160 1726853551.64269: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpkan_xs13" to remote "/root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py" <<< 24160 1726853551.65843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853551.65847: stderr chunk (state=3): >>><<< 24160 1726853551.65856: stdout chunk (state=3): >>><<< 24160 1726853551.66077: done transferring module to remote 24160 1726853551.66080: _low_level_execute_command(): starting 24160 1726853551.66083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/ /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py && sleep 0' 24160 1726853551.66549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853551.66564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853551.66579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853551.66787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853551.68421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853551.68425: stderr chunk (state=3): >>><<< 24160 1726853551.68430: stdout chunk (state=3): >>><<< 24160 1726853551.68447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853551.68450: _low_level_execute_command(): starting 24160 1726853551.68455: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/AnsiballZ_package_facts.py && sleep 0' 24160 1726853551.69015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853551.69026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853551.69037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853551.69050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853551.69065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853551.69075: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853551.69084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853551.69097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853551.69106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853551.69113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24160 1726853551.69120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853551.69143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853551.69294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853551.69298: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853551.69302: stderr chunk (state=3): >>>debug2: match found <<< 24160 1726853551.69305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853551.69307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853551.69309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853551.69311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853551.69349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853552.13470: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 24160 1726853552.13519: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": 
[{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": 
"1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 24160 1726853552.13534: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", 
"release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 24160 1726853552.13543: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 24160 1726853552.13549: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 24160 1726853552.13569: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 24160 1726853552.13586: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], 
"audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 24160 1726853552.13605: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", 
"version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 24160 1726853552.13644: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 24160 1726853552.13653: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": 
"1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 24160 1726853552.13662: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": 
"510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 24160 1726853552.13679: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 24160 1726853552.13685: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 24160 1726853552.13702: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, 
"invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24160 1726853552.15558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853552.15562: stdout chunk (state=3): >>><<< 24160 1726853552.15565: stderr chunk (state=3): >>><<< 24160 1726853552.15702: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": 
"redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": 
[{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", 
"version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": 
"git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853552.18395: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853552.18399: _low_level_execute_command(): starting 24160 1726853552.18401: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853551.581024-25508-44578599168015/ > /dev/null 2>&1 && sleep 0' 24160 1726853552.18921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853552.18936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853552.18949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853552.18965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853552.19058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853552.19076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853552.19100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853552.19198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853552.21057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853552.21072: stdout chunk (state=3): >>><<< 24160 1726853552.21090: stderr chunk (state=3): >>><<< 24160 1726853552.21116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853552.21129: handler run complete 24160 1726853552.22015: variable 'ansible_facts' from source: unknown 24160 1726853552.22675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.32805: variable 'ansible_facts' from source: unknown 24160 1726853552.33474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.34274: attempt loop complete, returning result 24160 1726853552.34285: _execute() done 24160 1726853552.34290: dumping result to json 24160 1726853552.34756: done dumping result, returning 24160 1726853552.34760: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-5676-4eb4-000000000568] 24160 1726853552.34762: sending task result for task 02083763-bbaf-5676-4eb4-000000000568 24160 1726853552.37145: done sending task result for task 02083763-bbaf-5676-4eb4-000000000568 24160 1726853552.37148: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853552.37295: no more pending results, returning what we have 24160 1726853552.37297: results queue empty 24160 1726853552.37298: checking for any_errors_fatal 24160 1726853552.37303: done checking for any_errors_fatal 24160 1726853552.37304: checking for max_fail_percentage 24160 1726853552.37305: done checking for max_fail_percentage 24160 1726853552.37306: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.37306: 
done checking to see if all hosts have failed 24160 1726853552.37307: getting the remaining hosts for this loop 24160 1726853552.37308: done getting the remaining hosts for this loop 24160 1726853552.37312: getting the next task for host managed_node1 24160 1726853552.37318: done getting next task for host managed_node1 24160 1726853552.37321: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24160 1726853552.37323: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853552.37332: getting variables 24160 1726853552.37333: in VariableManager get_vars() 24160 1726853552.37369: Calling all_inventory to load vars for managed_node1 24160 1726853552.37374: Calling groups_inventory to load vars for managed_node1 24160 1726853552.37376: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.37384: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.37387: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.37390: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.43023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.44635: done with get_vars() 24160 1726853552.44662: done getting variables 24160 1726853552.44713: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:32:32 -0400 (0:00:00.915) 0:00:28.850 ****** 24160 1726853552.44747: entering _queue_task() for managed_node1/debug 24160 1726853552.45208: worker is 1 (out of 1 available) 24160 1726853552.45222: exiting _queue_task() for managed_node1/debug 24160 1726853552.45234: done queuing things up, now waiting for results queue to drain 24160 1726853552.45236: waiting for pending results... 24160 1726853552.45477: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 24160 1726853552.45633: in run() - task 02083763-bbaf-5676-4eb4-000000000085 24160 1726853552.45637: variable 'ansible_search_path' from source: unknown 24160 1726853552.45640: variable 'ansible_search_path' from source: unknown 24160 1726853552.45740: calling self._execute() 24160 1726853552.45787: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.45802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.45818: variable 'omit' from source: magic vars 24160 1726853552.46222: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.46234: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.46239: variable 'omit' from source: magic vars 24160 1726853552.46268: variable 'omit' from source: magic vars 24160 1726853552.46344: variable 'network_provider' from source: set_fact 24160 1726853552.46355: variable 'omit' from source: magic vars 24160 1726853552.46392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853552.46418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853552.46435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 
1726853552.46449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853552.46460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853552.46485: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853552.46496: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.46502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.46563: Set connection var ansible_shell_executable to /bin/sh 24160 1726853552.46568: Set connection var ansible_pipelining to False 24160 1726853552.46573: Set connection var ansible_connection to ssh 24160 1726853552.46575: Set connection var ansible_shell_type to sh 24160 1726853552.46583: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853552.46590: Set connection var ansible_timeout to 10 24160 1726853552.46610: variable 'ansible_shell_executable' from source: unknown 24160 1726853552.46613: variable 'ansible_connection' from source: unknown 24160 1726853552.46616: variable 'ansible_module_compression' from source: unknown 24160 1726853552.46618: variable 'ansible_shell_type' from source: unknown 24160 1726853552.46620: variable 'ansible_shell_executable' from source: unknown 24160 1726853552.46622: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.46624: variable 'ansible_pipelining' from source: unknown 24160 1726853552.46626: variable 'ansible_timeout' from source: unknown 24160 1726853552.46632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.46734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853552.46743: variable 'omit' from source: magic vars 24160 1726853552.46748: starting attempt loop 24160 1726853552.46751: running the handler 24160 1726853552.46790: handler run complete 24160 1726853552.46800: attempt loop complete, returning result 24160 1726853552.46803: _execute() done 24160 1726853552.46805: dumping result to json 24160 1726853552.46808: done dumping result, returning 24160 1726853552.46816: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-5676-4eb4-000000000085] 24160 1726853552.46819: sending task result for task 02083763-bbaf-5676-4eb4-000000000085 24160 1726853552.46901: done sending task result for task 02083763-bbaf-5676-4eb4-000000000085 24160 1726853552.46903: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 24160 1726853552.46986: no more pending results, returning what we have 24160 1726853552.46989: results queue empty 24160 1726853552.46990: checking for any_errors_fatal 24160 1726853552.47001: done checking for any_errors_fatal 24160 1726853552.47002: checking for max_fail_percentage 24160 1726853552.47003: done checking for max_fail_percentage 24160 1726853552.47004: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.47005: done checking to see if all hosts have failed 24160 1726853552.47006: getting the remaining hosts for this loop 24160 1726853552.47007: done getting the remaining hosts for this loop 24160 1726853552.47010: getting the next task for host managed_node1 24160 1726853552.47015: done getting next task for host managed_node1 24160 1726853552.47019: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 24160 1726853552.47021: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853552.47029: getting variables 24160 1726853552.47031: in VariableManager get_vars() 24160 1726853552.47063: Calling all_inventory to load vars for managed_node1 24160 1726853552.47066: Calling groups_inventory to load vars for managed_node1 24160 1726853552.47068: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.47077: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.47080: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.47082: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.47837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.49228: done with get_vars() 24160 1726853552.49245: done getting variables 24160 1726853552.49289: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:32:32 -0400 (0:00:00.045) 0:00:28.895 ****** 24160 1726853552.49312: entering _queue_task() for managed_node1/fail 24160 1726853552.49537: worker is 1 (out of 1 available) 24160 1726853552.49549: exiting 
_queue_task() for managed_node1/fail 24160 1726853552.49577: done queuing things up, now waiting for results queue to drain 24160 1726853552.49579: waiting for pending results... 24160 1726853552.49759: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24160 1726853552.49845: in run() - task 02083763-bbaf-5676-4eb4-000000000086 24160 1726853552.49856: variable 'ansible_search_path' from source: unknown 24160 1726853552.49863: variable 'ansible_search_path' from source: unknown 24160 1726853552.49895: calling self._execute() 24160 1726853552.49972: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.49977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.49986: variable 'omit' from source: magic vars 24160 1726853552.50280: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.50289: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.50373: variable 'network_state' from source: role '' defaults 24160 1726853552.50380: Evaluated conditional (network_state != {}): False 24160 1726853552.50384: when evaluation is False, skipping this task 24160 1726853552.50386: _execute() done 24160 1726853552.50389: dumping result to json 24160 1726853552.50391: done dumping result, returning 24160 1726853552.50398: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-5676-4eb4-000000000086] 24160 1726853552.50404: sending task result for task 02083763-bbaf-5676-4eb4-000000000086 24160 1726853552.50496: done sending task result for task 02083763-bbaf-5676-4eb4-000000000086 24160 1726853552.50498: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853552.50541: no more pending results, returning what we have 24160 1726853552.50545: results queue empty 24160 1726853552.50545: checking for any_errors_fatal 24160 1726853552.50550: done checking for any_errors_fatal 24160 1726853552.50551: checking for max_fail_percentage 24160 1726853552.50553: done checking for max_fail_percentage 24160 1726853552.50554: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.50554: done checking to see if all hosts have failed 24160 1726853552.50555: getting the remaining hosts for this loop 24160 1726853552.50556: done getting the remaining hosts for this loop 24160 1726853552.50560: getting the next task for host managed_node1 24160 1726853552.50565: done getting next task for host managed_node1 24160 1726853552.50569: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24160 1726853552.50573: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853552.50587: getting variables 24160 1726853552.50588: in VariableManager get_vars() 24160 1726853552.50618: Calling all_inventory to load vars for managed_node1 24160 1726853552.50620: Calling groups_inventory to load vars for managed_node1 24160 1726853552.50622: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.50631: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.50633: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.50636: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.51782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.52969: done with get_vars() 24160 1726853552.52985: done getting variables 24160 1726853552.53027: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:32:32 -0400 (0:00:00.037) 0:00:28.933 ****** 24160 1726853552.53049: entering _queue_task() for managed_node1/fail 24160 1726853552.53257: worker is 1 (out of 1 available) 24160 1726853552.53269: exiting _queue_task() for managed_node1/fail 24160 1726853552.53284: done queuing things up, now waiting for results queue to drain 24160 1726853552.53286: waiting for pending results... 
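[editor's note] The two skipped `fail` tasks above both report a `false_condition` of `network_state != {}`. The pattern visible in the log — conditions checked in order, with the first one that evaluates False recorded as the skip reason — can be sketched in plain Python. This is an illustrative re-implementation, not Ansible's actual source; the sample facts (major version `"9"`, empty `network_state`) are assumptions consistent with the evaluations logged above.

```python
# Plain-Python sketch (NOT ansible-core source) of ordered `when` evaluation:
# each condition is tested in turn; the first False one skips the task and
# becomes the "false_condition" field in the skip result.
def evaluate_when(conditions, variables):
    """Return (task_runs, false_condition), mirroring the log's skip output."""
    for expr, predicate in conditions:
        if not predicate(variables):
            return False, expr  # recorded as "false_condition"
    return True, None

# Assumed facts for illustration: a non-EL6 host with the role-default
# (empty) network_state, matching the evaluations logged above.
facts = {"ansible_distribution_major_version": "9", "network_state": {}}

runs, false_condition = evaluate_when(
    [
        ("ansible_distribution_major_version != '6'",
         lambda v: v["ansible_distribution_major_version"] != "6"),
        ("network_state != {}",
         lambda v: v["network_state"] != {}),
    ],
    facts,
)
# runs is False; false_condition is "network_state != {}", matching the
# skipping: [managed_node1] result shown above.
```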
24160 1726853552.53456: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24160 1726853552.53535: in run() - task 02083763-bbaf-5676-4eb4-000000000087 24160 1726853552.53546: variable 'ansible_search_path' from source: unknown 24160 1726853552.53549: variable 'ansible_search_path' from source: unknown 24160 1726853552.53582: calling self._execute() 24160 1726853552.53655: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.53662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.53672: variable 'omit' from source: magic vars 24160 1726853552.53942: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.53953: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.54034: variable 'network_state' from source: role '' defaults 24160 1726853552.54042: Evaluated conditional (network_state != {}): False 24160 1726853552.54045: when evaluation is False, skipping this task 24160 1726853552.54048: _execute() done 24160 1726853552.54051: dumping result to json 24160 1726853552.54055: done dumping result, returning 24160 1726853552.54067: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-5676-4eb4-000000000087] 24160 1726853552.54070: sending task result for task 02083763-bbaf-5676-4eb4-000000000087 24160 1726853552.54198: done sending task result for task 02083763-bbaf-5676-4eb4-000000000087 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853552.54277: no more pending results, returning what we have 24160 1726853552.54280: results queue empty 24160 
1726853552.54281: checking for any_errors_fatal 24160 1726853552.54285: done checking for any_errors_fatal 24160 1726853552.54286: checking for max_fail_percentage 24160 1726853552.54287: done checking for max_fail_percentage 24160 1726853552.54288: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.54289: done checking to see if all hosts have failed 24160 1726853552.54289: getting the remaining hosts for this loop 24160 1726853552.54290: done getting the remaining hosts for this loop 24160 1726853552.54293: getting the next task for host managed_node1 24160 1726853552.54299: done getting next task for host managed_node1 24160 1726853552.54338: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24160 1726853552.54340: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853552.54350: WORKER PROCESS EXITING 24160 1726853552.54358: getting variables 24160 1726853552.54359: in VariableManager get_vars() 24160 1726853552.54407: Calling all_inventory to load vars for managed_node1 24160 1726853552.54419: Calling groups_inventory to load vars for managed_node1 24160 1726853552.54421: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.54428: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.54429: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.54431: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.55638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.56861: done with get_vars() 24160 1726853552.56879: done getting variables 24160 1726853552.56918: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:32:32 -0400 (0:00:00.038) 0:00:28.972 ****** 24160 1726853552.56939: entering _queue_task() for managed_node1/fail 24160 1726853552.57144: worker is 1 (out of 1 available) 24160 1726853552.57159: exiting _queue_task() for managed_node1/fail 24160 1726853552.57172: done queuing things up, now waiting for results queue to drain 24160 1726853552.57174: waiting for pending results... 
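[editor's note] The task queued next gates on two facts in sequence, and the log below shows both evaluating True: `ansible_distribution_major_version | int > 9` and `ansible_distribution in __network_rh_distros`. A minimal sketch of that gate, with hypothetical fact values (the log does not print the actual distribution or version):

```python
# Sketch of the EL10-or-later gate evaluated below (illustrative only; the
# role's real `when` list lives in tasks/main.yml:25).
def is_el10_or_later(facts):
    return (int(facts["ansible_distribution_major_version"]) > 9
            and facts["ansible_distribution"] in facts["__network_rh_distros"])

# Hypothetical facts consistent with the two True evaluations in the log:
facts = {
    "ansible_distribution_major_version": "10",
    "ansible_distribution": "CentOS",
    "__network_rh_distros": ["RedHat", "CentOS", "Fedora"],
}
# is_el10_or_later(facts) is True, so evaluation proceeds to the teaming
# condition rather than skipping immediately.
```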
24160 1726853552.57343: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24160 1726853552.57425: in run() - task 02083763-bbaf-5676-4eb4-000000000088 24160 1726853552.57435: variable 'ansible_search_path' from source: unknown 24160 1726853552.57438: variable 'ansible_search_path' from source: unknown 24160 1726853552.57467: calling self._execute() 24160 1726853552.57542: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.57548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.57559: variable 'omit' from source: magic vars 24160 1726853552.57829: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.57839: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.58012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853552.59758: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853552.59805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853552.59834: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853552.59860: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853552.59882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853552.59939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.59961: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.59983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.60008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.60020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.60087: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.60099: Evaluated conditional (ansible_distribution_major_version | int > 9): True 24160 1726853552.60175: variable 'ansible_distribution' from source: facts 24160 1726853552.60181: variable '__network_rh_distros' from source: role '' defaults 24160 1726853552.60190: Evaluated conditional (ansible_distribution in __network_rh_distros): True 24160 1726853552.60345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.60364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.60384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 
1726853552.60410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.60420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.60455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.60469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.60488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.60512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.60523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.60551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.60570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 24160 1726853552.60588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.60613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.60623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.60814: variable 'network_connections' from source: play vars 24160 1726853552.60823: variable 'profile' from source: play vars 24160 1726853552.60873: variable 'profile' from source: play vars 24160 1726853552.60877: variable 'interface' from source: set_fact 24160 1726853552.60922: variable 'interface' from source: set_fact 24160 1726853552.60930: variable 'network_state' from source: role '' defaults 24160 1726853552.60978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853552.61100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853552.61127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853552.61149: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853552.61173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853552.61202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853552.61222: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853552.61239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.61276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853552.61280: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 24160 1726853552.61285: when evaluation is False, skipping this task 24160 1726853552.61287: _execute() done 24160 1726853552.61289: dumping result to json 24160 1726853552.61292: done dumping result, returning 24160 1726853552.61294: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-5676-4eb4-000000000088] 24160 1726853552.61300: sending task result for task 02083763-bbaf-5676-4eb4-000000000088 24160 1726853552.61381: done sending task result for task 02083763-bbaf-5676-4eb4-000000000088 24160 1726853552.61383: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 24160 
1726853552.61446: no more pending results, returning what we have 24160 1726853552.61449: results queue empty 24160 1726853552.61450: checking for any_errors_fatal 24160 1726853552.61458: done checking for any_errors_fatal 24160 1726853552.61459: checking for max_fail_percentage 24160 1726853552.61460: done checking for max_fail_percentage 24160 1726853552.61461: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.61462: done checking to see if all hosts have failed 24160 1726853552.61462: getting the remaining hosts for this loop 24160 1726853552.61463: done getting the remaining hosts for this loop 24160 1726853552.61467: getting the next task for host managed_node1 24160 1726853552.61474: done getting next task for host managed_node1 24160 1726853552.61477: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24160 1726853552.61479: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853552.61491: getting variables 24160 1726853552.61498: in VariableManager get_vars() 24160 1726853552.61530: Calling all_inventory to load vars for managed_node1 24160 1726853552.61532: Calling groups_inventory to load vars for managed_node1 24160 1726853552.61534: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.61542: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.61545: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.61547: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.62327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.63226: done with get_vars() 24160 1726853552.63243: done getting variables 24160 1726853552.63290: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:32:32 -0400 (0:00:00.063) 0:00:29.035 ****** 24160 1726853552.63311: entering _queue_task() for managed_node1/dnf 24160 1726853552.63523: worker is 1 (out of 1 available) 24160 1726853552.63537: exiting _queue_task() for managed_node1/dnf 24160 1726853552.63549: done queuing things up, now waiting for results queue to drain 24160 1726853552.63551: waiting for pending results... 
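[editor's note] The teaming conditional skipped above chains Jinja2 filters: `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`, or the same test against `network_state.get("interfaces", [])`. A plain-Python equivalent (illustrative, not Jinja2/Ansible source — note that `match` here is Ansible's regex test, not a core Jinja2 test):

```python
import re

# Plain-Python re-expression of the skipped teaming conditional: keep items
# that define a "type" attribute whose value matches ^team$, then ask whether
# either list is non-empty.
def has_team_interface(network_connections, network_state):
    def teams(items):
        return [i for i in items
                if "type" in i and re.match("^team$", i["type"])]
    return (len(teams(network_connections)) > 0
            or len(teams(network_state.get("interfaces", []))) > 0)

# With no team-typed entries anywhere, the condition is False and the task
# is skipped, as in the log above.
```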
24160 1726853552.63720: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24160 1726853552.63790: in run() - task 02083763-bbaf-5676-4eb4-000000000089 24160 1726853552.63801: variable 'ansible_search_path' from source: unknown 24160 1726853552.63804: variable 'ansible_search_path' from source: unknown 24160 1726853552.63833: calling self._execute() 24160 1726853552.63908: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.63915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.63923: variable 'omit' from source: magic vars 24160 1726853552.64192: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.64201: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.64350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853552.65791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853552.66066: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853552.66098: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853552.66122: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853552.66140: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853552.66203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.66223: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.66242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.66272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.66288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.66356: variable 'ansible_distribution' from source: facts 24160 1726853552.66362: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.66376: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24160 1726853552.66449: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853552.66535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.66551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.66573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.66599: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.66612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.66640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.66656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.66676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.66700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.66715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.66740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.66756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 
1726853552.66776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.66800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.66810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.66910: variable 'network_connections' from source: play vars 24160 1726853552.66918: variable 'profile' from source: play vars 24160 1726853552.66969: variable 'profile' from source: play vars 24160 1726853552.66974: variable 'interface' from source: set_fact 24160 1726853552.67014: variable 'interface' from source: set_fact 24160 1726853552.67068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853552.67179: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853552.67204: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853552.67225: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853552.67257: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853552.67291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853552.67307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853552.67327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.67344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853552.67384: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853552.67529: variable 'network_connections' from source: play vars 24160 1726853552.67532: variable 'profile' from source: play vars 24160 1726853552.67577: variable 'profile' from source: play vars 24160 1726853552.67582: variable 'interface' from source: set_fact 24160 1726853552.67624: variable 'interface' from source: set_fact 24160 1726853552.67643: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853552.67646: when evaluation is False, skipping this task 24160 1726853552.67648: _execute() done 24160 1726853552.67651: dumping result to json 24160 1726853552.67653: done dumping result, returning 24160 1726853552.67662: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-000000000089] 24160 1726853552.67665: sending task result for task 02083763-bbaf-5676-4eb4-000000000089 24160 1726853552.67748: done sending task result for task 02083763-bbaf-5676-4eb4-000000000089 24160 1726853552.67750: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 24160 1726853552.67803: no more pending results, returning what we have 24160 1726853552.67806: results queue empty 24160 1726853552.67807: checking for any_errors_fatal 24160 1726853552.67813: done checking for any_errors_fatal 24160 1726853552.67813: checking for max_fail_percentage 24160 1726853552.67815: done checking for max_fail_percentage 24160 1726853552.67816: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.67816: done checking to see if all hosts have failed 24160 1726853552.67817: getting the remaining hosts for this loop 24160 1726853552.67818: done getting the remaining hosts for this loop 24160 1726853552.67822: getting the next task for host managed_node1 24160 1726853552.67828: done getting next task for host managed_node1 24160 1726853552.67832: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24160 1726853552.67834: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853552.67846: getting variables 24160 1726853552.67847: in VariableManager get_vars() 24160 1726853552.67883: Calling all_inventory to load vars for managed_node1 24160 1726853552.67886: Calling groups_inventory to load vars for managed_node1 24160 1726853552.67888: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.67896: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.67899: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.67901: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.68812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.70052: done with get_vars() 24160 1726853552.70075: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24160 1726853552.70147: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:32:32 -0400 (0:00:00.068) 0:00:29.104 ****** 24160 1726853552.70178: entering _queue_task() for managed_node1/yum 24160 1726853552.70437: worker is 1 (out of 1 available) 24160 1726853552.70452: exiting _queue_task() for managed_node1/yum 24160 1726853552.70463: done queuing things up, now waiting for results queue to drain 24160 1726853552.70465: waiting for pending results... 
24160 1726853552.70651: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24160 1726853552.70730: in run() - task 02083763-bbaf-5676-4eb4-00000000008a 24160 1726853552.70743: variable 'ansible_search_path' from source: unknown 24160 1726853552.70746: variable 'ansible_search_path' from source: unknown 24160 1726853552.70777: calling self._execute() 24160 1726853552.70853: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.70862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.70873: variable 'omit' from source: magic vars 24160 1726853552.71156: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.71169: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.71296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853552.73177: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853552.73180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853552.73193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853552.73233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853552.73267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853552.73343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.73392: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24160 1726853552.73425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24160 1726853552.73476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24160 1726853552.73495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24160 1726853552.73586: variable 'ansible_distribution_major_version' from source: facts
24160 1726853552.73607: Evaluated conditional (ansible_distribution_major_version | int < 8): False
24160 1726853552.73614: when evaluation is False, skipping this task
24160 1726853552.73621: _execute() done
24160 1726853552.73629: dumping result to json
24160 1726853552.73635: done dumping result, returning
24160 1726853552.73646: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000008a]
24160 1726853552.73658: sending task result for task 02083763-bbaf-5676-4eb4-00000000008a
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
24160 1726853552.73805: no more pending results, returning what we have
24160 1726853552.73809: results queue empty
24160 1726853552.73810: checking for any_errors_fatal
24160 1726853552.73817: done
checking for any_errors_fatal 24160 1726853552.73818: checking for max_fail_percentage 24160 1726853552.73819: done checking for max_fail_percentage 24160 1726853552.73820: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.73821: done checking to see if all hosts have failed 24160 1726853552.73821: getting the remaining hosts for this loop 24160 1726853552.73822: done getting the remaining hosts for this loop 24160 1726853552.73826: getting the next task for host managed_node1 24160 1726853552.73832: done getting next task for host managed_node1 24160 1726853552.73836: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24160 1726853552.73838: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853552.73849: getting variables 24160 1726853552.73850: in VariableManager get_vars() 24160 1726853552.73889: Calling all_inventory to load vars for managed_node1 24160 1726853552.73891: Calling groups_inventory to load vars for managed_node1 24160 1726853552.73893: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.73904: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.73907: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.73909: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.74584: done sending task result for task 02083763-bbaf-5676-4eb4-00000000008a 24160 1726853552.74588: WORKER PROCESS EXITING 24160 1726853552.75433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.77735: done with get_vars() 24160 1726853552.77760: done getting variables 24160 1726853552.77822: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:32:32 -0400 (0:00:00.076) 0:00:29.181 ****** 24160 1726853552.77856: entering _queue_task() for managed_node1/fail 24160 1726853552.78196: worker is 1 (out of 1 available) 24160 1726853552.78210: exiting _queue_task() for managed_node1/fail 24160 1726853552.78223: done queuing things up, now waiting for results queue to drain 24160 1726853552.78224: waiting for pending results... 
24160 1726853552.78689: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24160 1726853552.78694: in run() - task 02083763-bbaf-5676-4eb4-00000000008b 24160 1726853552.78696: variable 'ansible_search_path' from source: unknown 24160 1726853552.78699: variable 'ansible_search_path' from source: unknown 24160 1726853552.78701: calling self._execute() 24160 1726853552.78765: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.78775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.78779: variable 'omit' from source: magic vars 24160 1726853552.79082: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.79093: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.79176: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853552.79680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853552.82632: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853552.82698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853552.82749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853552.82790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853552.82820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853552.82898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24160 1726853552.82932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.82961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.83006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.83024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.83076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.83105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.83133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.83177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.83195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.83236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.83263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.83292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.83334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.83361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.83525: variable 'network_connections' from source: play vars 24160 1726853552.83543: variable 'profile' from source: play vars 24160 1726853552.83617: variable 'profile' from source: play vars 24160 1726853552.83626: variable 'interface' from source: set_fact 24160 1726853552.83688: variable 'interface' from source: set_fact 24160 1726853552.83761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853552.83922: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853552.83962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853552.83996: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853552.84176: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853552.84179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853552.84184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853552.84186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.84188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853552.84207: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853552.84486: variable 'network_connections' from source: play vars 24160 1726853552.84495: variable 'profile' from source: play vars 24160 1726853552.84555: variable 'profile' from source: play vars 24160 1726853552.84564: variable 'interface' from source: set_fact 24160 1726853552.84626: variable 'interface' from source: set_fact 24160 1726853552.84654: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853552.84661: when evaluation is False, skipping this task 24160 1726853552.84668: _execute() done 24160 1726853552.84678: dumping result to json 24160 1726853552.84686: done dumping result, returning 24160 1726853552.84697: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000008b]
24160 1726853552.84713: sending task result for task 02083763-bbaf-5676-4eb4-00000000008b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
24160 1726853552.84865: no more pending results, returning what we have
24160 1726853552.84868: results queue empty
24160 1726853552.84869: checking for any_errors_fatal
24160 1726853552.84879: done checking for any_errors_fatal
24160 1726853552.84879: checking for max_fail_percentage
24160 1726853552.84881: done checking for max_fail_percentage
24160 1726853552.84882: checking to see if all hosts have failed and the running result is not ok
24160 1726853552.84883: done checking to see if all hosts have failed
24160 1726853552.84883: getting the remaining hosts for this loop
24160 1726853552.84885: done getting the remaining hosts for this loop
24160 1726853552.84976: getting the next task for host managed_node1
24160 1726853552.84983: done getting next task for host managed_node1
24160 1726853552.84986: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
24160 1726853552.84988: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 24160 1726853552.85009: done sending task result for task 02083763-bbaf-5676-4eb4-00000000008b 24160 1726853552.85012: WORKER PROCESS EXITING 24160 1726853552.85017: getting variables 24160 1726853552.85019: in VariableManager get_vars() 24160 1726853552.85061: Calling all_inventory to load vars for managed_node1 24160 1726853552.85064: Calling groups_inventory to load vars for managed_node1 24160 1726853552.85066: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.85242: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.85246: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.85249: Calling groups_plugins_play to load vars for managed_node1 24160 1726853552.87415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853552.89015: done with get_vars() 24160 1726853552.89062: done getting variables 24160 1726853552.89312: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:32:32 -0400 (0:00:00.114) 0:00:29.296 ****** 24160 1726853552.89346: entering _queue_task() for managed_node1/package 24160 1726853552.90014: worker is 1 (out of 1 available) 24160 1726853552.90028: exiting _queue_task() for managed_node1/package 24160 1726853552.90043: done queuing things up, now waiting for results queue to drain 24160 1726853552.90045: waiting for pending results... 
24160 1726853552.90311: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 24160 1726853552.90723: in run() - task 02083763-bbaf-5676-4eb4-00000000008c 24160 1726853552.90727: variable 'ansible_search_path' from source: unknown 24160 1726853552.90730: variable 'ansible_search_path' from source: unknown 24160 1726853552.90733: calling self._execute() 24160 1726853552.90903: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853552.90926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853552.90948: variable 'omit' from source: magic vars 24160 1726853552.91337: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.91375: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853552.91592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853552.91804: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853552.91848: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853552.91885: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853552.91943: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853552.92027: variable 'network_packages' from source: role '' defaults 24160 1726853552.92102: variable '__network_provider_setup' from source: role '' defaults 24160 1726853552.92110: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853552.92163: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853552.92172: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853552.92215: variable 
'__network_packages_default_nm' from source: role '' defaults 24160 1726853552.92332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853552.94383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853552.94453: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853552.94512: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853552.94564: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853552.94605: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853552.94676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.94697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.94720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.94745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.94758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 
1726853552.94790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.94813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.94827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.94851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.94863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.95010: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24160 1726853552.95087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.95103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.95119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.95146: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.95159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.95217: variable 'ansible_python' from source: facts 24160 1726853552.95236: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24160 1726853552.95294: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853552.95346: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853552.95429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.95446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.95464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.95495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.95505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.95536: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853552.95555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853552.95578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.95603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853552.95613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853552.95709: variable 'network_connections' from source: play vars 24160 1726853552.95714: variable 'profile' from source: play vars 24160 1726853552.95789: variable 'profile' from source: play vars 24160 1726853552.95792: variable 'interface' from source: set_fact 24160 1726853552.95858: variable 'interface' from source: set_fact 24160 1726853552.96083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853552.96086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853552.96089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853552.96091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853552.96093: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853552.96355: variable 'network_connections' from source: play vars 24160 1726853552.96365: variable 'profile' from source: play vars 24160 1726853552.96475: variable 'profile' from source: play vars 24160 1726853552.96489: variable 'interface' from source: set_fact 24160 1726853552.96561: variable 'interface' from source: set_fact 24160 1726853552.96603: variable '__network_packages_default_wireless' from source: role '' defaults 24160 1726853552.96689: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853552.96989: variable 'network_connections' from source: play vars 24160 1726853552.97000: variable 'profile' from source: play vars 24160 1726853552.97068: variable 'profile' from source: play vars 24160 1726853552.97089: variable 'interface' from source: set_fact 24160 1726853552.97226: variable 'interface' from source: set_fact 24160 1726853552.97246: variable '__network_packages_default_team' from source: role '' defaults 24160 1726853552.97305: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853552.97512: variable 'network_connections' from source: play vars 24160 1726853552.97515: variable 'profile' from source: play vars 24160 1726853552.97558: variable 'profile' from source: play vars 24160 1726853552.97567: variable 'interface' from source: set_fact 24160 1726853552.97637: variable 'interface' from source: set_fact 24160 1726853552.97675: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853552.97716: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853552.97726: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853552.97769: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853552.97907: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24160 1726853552.98220: variable 'network_connections' from source: play vars 24160 1726853552.98223: variable 'profile' from source: play vars 24160 1726853552.98269: variable 'profile' from source: play vars 24160 1726853552.98274: variable 'interface' from source: set_fact 24160 1726853552.98318: variable 'interface' from source: set_fact 24160 1726853552.98325: variable 'ansible_distribution' from source: facts 24160 1726853552.98328: variable '__network_rh_distros' from source: role '' defaults 24160 1726853552.98333: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.98344: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24160 1726853552.98451: variable 'ansible_distribution' from source: facts 24160 1726853552.98455: variable '__network_rh_distros' from source: role '' defaults 24160 1726853552.98461: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.98477: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24160 1726853552.98578: variable 'ansible_distribution' from source: facts 24160 1726853552.98583: variable '__network_rh_distros' from source: role '' defaults 24160 1726853552.98587: variable 'ansible_distribution_major_version' from source: facts 24160 1726853552.98614: variable 'network_provider' from source: set_fact 24160 1726853552.98625: variable 'ansible_facts' from source: unknown 24160 1726853552.99476: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24160 
1726853552.99479: when evaluation is False, skipping this task 24160 1726853552.99481: _execute() done 24160 1726853552.99484: dumping result to json 24160 1726853552.99485: done dumping result, returning 24160 1726853552.99491: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-5676-4eb4-00000000008c] 24160 1726853552.99494: sending task result for task 02083763-bbaf-5676-4eb4-00000000008c 24160 1726853552.99556: done sending task result for task 02083763-bbaf-5676-4eb4-00000000008c 24160 1726853552.99558: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24160 1726853552.99642: no more pending results, returning what we have 24160 1726853552.99646: results queue empty 24160 1726853552.99647: checking for any_errors_fatal 24160 1726853552.99652: done checking for any_errors_fatal 24160 1726853552.99653: checking for max_fail_percentage 24160 1726853552.99654: done checking for max_fail_percentage 24160 1726853552.99655: checking to see if all hosts have failed and the running result is not ok 24160 1726853552.99656: done checking to see if all hosts have failed 24160 1726853552.99656: getting the remaining hosts for this loop 24160 1726853552.99657: done getting the remaining hosts for this loop 24160 1726853552.99661: getting the next task for host managed_node1 24160 1726853552.99667: done getting next task for host managed_node1 24160 1726853552.99672: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24160 1726853552.99674: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24160 1726853552.99687: getting variables 24160 1726853552.99688: in VariableManager get_vars() 24160 1726853552.99816: Calling all_inventory to load vars for managed_node1 24160 1726853552.99819: Calling groups_inventory to load vars for managed_node1 24160 1726853552.99821: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853552.99833: Calling all_plugins_play to load vars for managed_node1 24160 1726853552.99835: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853552.99837: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.01071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.02082: done with get_vars() 24160 1726853553.02098: done getting variables 24160 1726853553.02141: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:32:33 -0400 (0:00:00.128) 0:00:29.424 ****** 24160 1726853553.02164: entering _queue_task() for managed_node1/package 24160 1726853553.02396: worker is 1 (out of 1 available) 24160 1726853553.02408: exiting _queue_task() for managed_node1/package 24160 1726853553.02421: done queuing things up, now waiting for results queue to drain 24160 1726853553.02423: waiting for pending results... 
24160 1726853553.02603: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24160 1726853553.02680: in run() - task 02083763-bbaf-5676-4eb4-00000000008d 24160 1726853553.02693: variable 'ansible_search_path' from source: unknown 24160 1726853553.02696: variable 'ansible_search_path' from source: unknown 24160 1726853553.02724: calling self._execute() 24160 1726853553.02806: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.02811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.02819: variable 'omit' from source: magic vars 24160 1726853553.03183: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.03376: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.03381: variable 'network_state' from source: role '' defaults 24160 1726853553.03384: Evaluated conditional (network_state != {}): False 24160 1726853553.03386: when evaluation is False, skipping this task 24160 1726853553.03388: _execute() done 24160 1726853553.03391: dumping result to json 24160 1726853553.03393: done dumping result, returning 24160 1726853553.03396: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-5676-4eb4-00000000008d] 24160 1726853553.03398: sending task result for task 02083763-bbaf-5676-4eb4-00000000008d 24160 1726853553.03466: done sending task result for task 02083763-bbaf-5676-4eb4-00000000008d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853553.03516: no more pending results, returning what we have 24160 1726853553.03520: results queue empty 24160 1726853553.03521: checking for any_errors_fatal 24160 1726853553.03529: 
done checking for any_errors_fatal 24160 1726853553.03530: checking for max_fail_percentage 24160 1726853553.03531: done checking for max_fail_percentage 24160 1726853553.03532: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.03533: done checking to see if all hosts have failed 24160 1726853553.03534: getting the remaining hosts for this loop 24160 1726853553.03535: done getting the remaining hosts for this loop 24160 1726853553.03539: getting the next task for host managed_node1 24160 1726853553.03545: done getting next task for host managed_node1 24160 1726853553.03548: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24160 1726853553.03551: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853553.03566: getting variables 24160 1726853553.03567: in VariableManager get_vars() 24160 1726853553.03604: Calling all_inventory to load vars for managed_node1 24160 1726853553.03607: Calling groups_inventory to load vars for managed_node1 24160 1726853553.03609: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.03620: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.03623: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.03625: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.04404: WORKER PROCESS EXITING 24160 1726853553.04702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.05584: done with get_vars() 24160 1726853553.05599: done getting variables 24160 1726853553.05642: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:32:33 -0400 (0:00:00.034) 0:00:29.459 ****** 24160 1726853553.05664: entering _queue_task() for managed_node1/package 24160 1726853553.05889: worker is 1 (out of 1 available) 24160 1726853553.05902: exiting _queue_task() for managed_node1/package 24160 1726853553.05914: done queuing things up, now waiting for results queue to drain 24160 1726853553.05916: waiting for pending results... 
24160 1726853553.06087: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24160 1726853553.06166: in run() - task 02083763-bbaf-5676-4eb4-00000000008e 24160 1726853553.06179: variable 'ansible_search_path' from source: unknown 24160 1726853553.06183: variable 'ansible_search_path' from source: unknown 24160 1726853553.06210: calling self._execute() 24160 1726853553.06285: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.06290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.06299: variable 'omit' from source: magic vars 24160 1726853553.06574: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.06586: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.06666: variable 'network_state' from source: role '' defaults 24160 1726853553.06675: Evaluated conditional (network_state != {}): False 24160 1726853553.06678: when evaluation is False, skipping this task 24160 1726853553.06681: _execute() done 24160 1726853553.06686: dumping result to json 24160 1726853553.06688: done dumping result, returning 24160 1726853553.06699: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-5676-4eb4-00000000008e] 24160 1726853553.06702: sending task result for task 02083763-bbaf-5676-4eb4-00000000008e 24160 1726853553.06790: done sending task result for task 02083763-bbaf-5676-4eb4-00000000008e 24160 1726853553.06792: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853553.06841: no more pending results, returning what we have 24160 1726853553.06845: results queue empty 24160 1726853553.06846: checking for 
any_errors_fatal 24160 1726853553.06852: done checking for any_errors_fatal 24160 1726853553.06853: checking for max_fail_percentage 24160 1726853553.06854: done checking for max_fail_percentage 24160 1726853553.06855: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.06856: done checking to see if all hosts have failed 24160 1726853553.06857: getting the remaining hosts for this loop 24160 1726853553.06858: done getting the remaining hosts for this loop 24160 1726853553.06861: getting the next task for host managed_node1 24160 1726853553.06866: done getting next task for host managed_node1 24160 1726853553.06869: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24160 1726853553.06873: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853553.06887: getting variables 24160 1726853553.06888: in VariableManager get_vars() 24160 1726853553.06916: Calling all_inventory to load vars for managed_node1 24160 1726853553.06918: Calling groups_inventory to load vars for managed_node1 24160 1726853553.06920: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.06929: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.06931: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.06933: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.07812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.08680: done with get_vars() 24160 1726853553.08694: done getting variables 24160 1726853553.08736: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:32:33 -0400 (0:00:00.030) 0:00:29.490 ****** 24160 1726853553.08756: entering _queue_task() for managed_node1/service 24160 1726853553.08953: worker is 1 (out of 1 available) 24160 1726853553.08967: exiting _queue_task() for managed_node1/service 24160 1726853553.08981: done queuing things up, now waiting for results queue to drain 24160 1726853553.08983: waiting for pending results... 
24160 1726853553.09147: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24160 1726853553.09219: in run() - task 02083763-bbaf-5676-4eb4-00000000008f 24160 1726853553.09231: variable 'ansible_search_path' from source: unknown 24160 1726853553.09234: variable 'ansible_search_path' from source: unknown 24160 1726853553.09262: calling self._execute() 24160 1726853553.09337: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.09341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.09349: variable 'omit' from source: magic vars 24160 1726853553.09622: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.09632: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.09719: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853553.09846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853553.11309: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853553.11361: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853553.11420: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853553.11433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853553.11453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853553.11514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 24160 1726853553.11534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.11550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.11579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.11591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.11625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.11642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.11657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.11689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.11699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.11730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.11746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.11764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.11790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.11802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.11917: variable 'network_connections' from source: play vars 24160 1726853553.11929: variable 'profile' from source: play vars 24160 1726853553.11982: variable 'profile' from source: play vars 24160 1726853553.11986: variable 'interface' from source: set_fact 24160 1726853553.12030: variable 'interface' from source: set_fact 24160 1726853553.12083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853553.12207: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853553.12234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853553.12264: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853553.12283: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853553.12312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853553.12328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853553.12345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.12366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853553.12406: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853553.12551: variable 'network_connections' from source: play vars 24160 1726853553.12558: variable 'profile' from source: play vars 24160 1726853553.12602: variable 'profile' from source: play vars 24160 1726853553.12605: variable 'interface' from source: set_fact 24160 1726853553.12647: variable 'interface' from source: set_fact 24160 1726853553.12666: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24160 1726853553.12670: when evaluation is False, skipping this task 24160 1726853553.12674: _execute() done 24160 1726853553.12677: dumping result to json 24160 1726853553.12680: done dumping result, returning 24160 1726853553.12693: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [02083763-bbaf-5676-4eb4-00000000008f] 24160 1726853553.12705: sending task result for task 02083763-bbaf-5676-4eb4-00000000008f 24160 1726853553.12777: done sending task result for task 02083763-bbaf-5676-4eb4-00000000008f 24160 1726853553.12780: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24160 1726853553.12832: no more pending results, returning what we have 24160 1726853553.12835: results queue empty 24160 1726853553.12836: checking for any_errors_fatal 24160 1726853553.12843: done checking for any_errors_fatal 24160 1726853553.12844: checking for max_fail_percentage 24160 1726853553.12845: done checking for max_fail_percentage 24160 1726853553.12846: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.12847: done checking to see if all hosts have failed 24160 1726853553.12848: getting the remaining hosts for this loop 24160 1726853553.12849: done getting the remaining hosts for this loop 24160 1726853553.12853: getting the next task for host managed_node1 24160 1726853553.12861: done getting next task for host managed_node1 24160 1726853553.12865: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24160 1726853553.12867: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853553.12881: getting variables 24160 1726853553.12882: in VariableManager get_vars() 24160 1726853553.12917: Calling all_inventory to load vars for managed_node1 24160 1726853553.12919: Calling groups_inventory to load vars for managed_node1 24160 1726853553.12922: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.12930: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.12933: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.12935: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.13749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.15050: done with get_vars() 24160 1726853553.15076: done getting variables 24160 1726853553.15136: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:32:33 -0400 (0:00:00.064) 0:00:29.554 ****** 24160 1726853553.15173: entering _queue_task() for managed_node1/service 24160 1726853553.15412: worker is 1 (out of 1 available) 24160 1726853553.15426: exiting _queue_task() for managed_node1/service 24160 1726853553.15439: done queuing things up, now waiting for results queue to drain 24160 1726853553.15441: waiting for pending results... 
24160 1726853553.15617: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24160 1726853553.15693: in run() - task 02083763-bbaf-5676-4eb4-000000000090 24160 1726853553.15706: variable 'ansible_search_path' from source: unknown 24160 1726853553.15709: variable 'ansible_search_path' from source: unknown 24160 1726853553.15749: calling self._execute() 24160 1726853553.15836: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.15841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.15848: variable 'omit' from source: magic vars 24160 1726853553.16136: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.16147: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.16259: variable 'network_provider' from source: set_fact 24160 1726853553.16263: variable 'network_state' from source: role '' defaults 24160 1726853553.16272: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24160 1726853553.16278: variable 'omit' from source: magic vars 24160 1726853553.16305: variable 'omit' from source: magic vars 24160 1726853553.16329: variable 'network_service_name' from source: role '' defaults 24160 1726853553.16380: variable 'network_service_name' from source: role '' defaults 24160 1726853553.16448: variable '__network_provider_setup' from source: role '' defaults 24160 1726853553.16453: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853553.16500: variable '__network_service_name_default_nm' from source: role '' defaults 24160 1726853553.16507: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853553.16556: variable '__network_packages_default_nm' from source: role '' defaults 24160 1726853553.16699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 24160 1726853553.18363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853553.18409: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853553.18445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853553.18470: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853553.18496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853553.18549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.18570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.18589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.18618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.18628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.18661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24160 1726853553.18678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.18695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.18722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.18733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.18876: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24160 1726853553.18951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.18969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.18987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.19010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.19021: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.19085: variable 'ansible_python' from source: facts 24160 1726853553.19102: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24160 1726853553.19160: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853553.19211: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853553.19293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.19310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.19327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.19350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.19374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.19480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.19494: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.19497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.19535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.19554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.19689: variable 'network_connections' from source: play vars 24160 1726853553.19702: variable 'profile' from source: play vars 24160 1726853553.19976: variable 'profile' from source: play vars 24160 1726853553.19980: variable 'interface' from source: set_fact 24160 1726853553.19982: variable 'interface' from source: set_fact 24160 1726853553.19984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853553.20134: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853553.20187: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853553.20233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853553.20280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853553.20341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853553.20378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853553.20416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.20457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853553.20499: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853553.20765: variable 'network_connections' from source: play vars 24160 1726853553.20779: variable 'profile' from source: play vars 24160 1726853553.20851: variable 'profile' from source: play vars 24160 1726853553.20863: variable 'interface' from source: set_fact 24160 1726853553.20924: variable 'interface' from source: set_fact 24160 1726853553.20960: variable '__network_packages_default_wireless' from source: role '' defaults 24160 1726853553.21041: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853553.21324: variable 'network_connections' from source: play vars 24160 1726853553.21335: variable 'profile' from source: play vars 24160 1726853553.21416: variable 'profile' from source: play vars 24160 1726853553.21459: variable 'interface' from source: set_fact 24160 1726853553.21531: variable 'interface' from source: set_fact 24160 1726853553.21560: variable '__network_packages_default_team' from source: role '' defaults 24160 1726853553.21638: variable '__network_team_connections_defined' from source: role '' defaults 24160 1726853553.21915: variable 
'network_connections' from source: play vars 24160 1726853553.21926: variable 'profile' from source: play vars 24160 1726853553.21996: variable 'profile' from source: play vars 24160 1726853553.22007: variable 'interface' from source: set_fact 24160 1726853553.22081: variable 'interface' from source: set_fact 24160 1726853553.22132: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853553.22196: variable '__network_service_name_default_initscripts' from source: role '' defaults 24160 1726853553.22204: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853553.22250: variable '__network_packages_default_initscripts' from source: role '' defaults 24160 1726853553.22395: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24160 1726853553.22699: variable 'network_connections' from source: play vars 24160 1726853553.22702: variable 'profile' from source: play vars 24160 1726853553.22749: variable 'profile' from source: play vars 24160 1726853553.22752: variable 'interface' from source: set_fact 24160 1726853553.22800: variable 'interface' from source: set_fact 24160 1726853553.22807: variable 'ansible_distribution' from source: facts 24160 1726853553.22810: variable '__network_rh_distros' from source: role '' defaults 24160 1726853553.22817: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.22827: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24160 1726853553.22938: variable 'ansible_distribution' from source: facts 24160 1726853553.22942: variable '__network_rh_distros' from source: role '' defaults 24160 1726853553.22946: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.22960: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24160 1726853553.23175: variable 'ansible_distribution' from source: 
facts 24160 1726853553.23179: variable '__network_rh_distros' from source: role '' defaults 24160 1726853553.23181: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.23183: variable 'network_provider' from source: set_fact 24160 1726853553.23185: variable 'omit' from source: magic vars 24160 1726853553.23186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853553.23213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853553.23237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853553.23258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853553.23277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853553.23309: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853553.23318: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.23327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.23431: Set connection var ansible_shell_executable to /bin/sh 24160 1726853553.23446: Set connection var ansible_pipelining to False 24160 1726853553.23453: Set connection var ansible_connection to ssh 24160 1726853553.23461: Set connection var ansible_shell_type to sh 24160 1726853553.23478: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853553.23492: Set connection var ansible_timeout to 10 24160 1726853553.23522: variable 'ansible_shell_executable' from source: unknown 24160 1726853553.23530: variable 'ansible_connection' from source: unknown 24160 1726853553.23537: variable 'ansible_module_compression' from source: unknown 24160 1726853553.23543: 
variable 'ansible_shell_type' from source: unknown 24160 1726853553.23548: variable 'ansible_shell_executable' from source: unknown 24160 1726853553.23555: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.23677: variable 'ansible_pipelining' from source: unknown 24160 1726853553.23680: variable 'ansible_timeout' from source: unknown 24160 1726853553.23682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.23685: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853553.23690: variable 'omit' from source: magic vars 24160 1726853553.23701: starting attempt loop 24160 1726853553.23708: running the handler 24160 1726853553.23789: variable 'ansible_facts' from source: unknown 24160 1726853553.24692: _low_level_execute_command(): starting 24160 1726853553.24704: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853553.25287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853553.25303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853553.25317: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853553.25365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853553.25384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853553.25429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853553.27153: stdout chunk (state=3): >>>/root <<< 24160 1726853553.27309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853553.27321: stderr chunk (state=3): >>><<< 24160 1726853553.27328: stdout chunk (state=3): >>><<< 24160 1726853553.27440: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853553.27446: _low_level_execute_command(): starting 24160 1726853553.27449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075 `" && echo ansible-tmp-1726853553.2735705-25585-10567760156075="` echo /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075 `" ) && sleep 0' 24160 1726853553.28530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853553.28584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853553.28587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853553.28589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853553.28591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853553.28595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853553.28597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853553.28691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853553.28695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853553.28773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853553.30853: stdout chunk (state=3): >>>ansible-tmp-1726853553.2735705-25585-10567760156075=/root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075 <<< 24160 1726853553.30914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853553.31187: stderr chunk (state=3): >>><<< 24160 1726853553.31191: stdout chunk (state=3): >>><<< 24160 1726853553.31194: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853553.2735705-25585-10567760156075=/root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 24160 1726853553.31197: variable 'ansible_module_compression' from source: unknown 24160 1726853553.31212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24160 1726853553.31436: variable 'ansible_facts' from source: unknown 24160 1726853553.31633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py 24160 1726853553.31897: Sending initial data 24160 1726853553.31900: Sent initial data (155 bytes) 24160 1726853553.33140: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853553.33309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853553.33389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853553.33487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853553.35196: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853553.35313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp93_9396x /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py <<< 24160 1726853553.35316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py" <<< 24160 1726853553.35346: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp93_9396x" to remote "/root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py" <<< 24160 1726853553.38408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853553.38645: stderr chunk (state=3): >>><<< 24160 1726853553.38649: stdout chunk (state=3): >>><<< 24160 1726853553.38652: done transferring module to remote 24160 1726853553.38654: _low_level_execute_command(): starting 24160 1726853553.38664: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/ /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py && sleep 0' 24160 1726853553.39577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853553.39725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853553.39755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853553.39827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853553.41878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853553.41883: stdout chunk (state=3): >>><<< 24160 1726853553.41885: stderr chunk (state=3): >>><<< 24160 1726853553.42006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853553.42010: _low_level_execute_command(): starting 24160 1726853553.42012: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/AnsiballZ_systemd.py && sleep 0' 24160 1726853553.42606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853553.42621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853553.42636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853553.42652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853553.42674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853553.42724: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853553.42787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853553.42844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853553.42887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853553.72079: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10682368", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315335168", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1109455000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": 
"[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 24160 1726853553.72086: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-<<< 24160 1726853553.72115: stdout chunk (state=3): >>>init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24160 1726853553.73903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853553.73927: stderr chunk (state=3): >>><<< 24160 1726853553.73930: stdout chunk (state=3): >>><<< 24160 1726853553.73944: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10682368", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315335168", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1109455000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853553.74063: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853553.74080: _low_level_execute_command(): starting 24160 1726853553.74084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853553.2735705-25585-10567760156075/ > /dev/null 2>&1 && sleep 0' 24160 1726853553.74504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853553.74507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853553.74509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853553.74511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853553.74514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853553.74562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853553.74565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853553.74608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853553.76411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853553.76422: stderr chunk (state=3): >>><<< 24160 1726853553.76427: stdout chunk (state=3): >>><<< 24160 1726853553.76440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853553.76477: handler run complete 24160 1726853553.76496: attempt loop complete, returning result 24160 1726853553.76499: _execute() done 24160 1726853553.76501: dumping result to json 24160 1726853553.76513: done dumping result, returning 24160 1726853553.76521: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-5676-4eb4-000000000090] 24160 1726853553.76530: sending task result for task 02083763-bbaf-5676-4eb4-000000000090 24160 1726853553.76774: done sending task result for task 02083763-bbaf-5676-4eb4-000000000090 24160 1726853553.76777: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853553.76821: no more pending results, returning what we have 24160 1726853553.76824: results queue empty 24160 1726853553.76825: checking for any_errors_fatal 24160 1726853553.76831: done checking for any_errors_fatal 24160 1726853553.76831: checking for max_fail_percentage 24160 1726853553.76833: done checking for max_fail_percentage 24160 1726853553.76834: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.76834: done checking to see if all hosts have failed 24160 1726853553.76835: getting the remaining hosts for this loop 24160 1726853553.76836: done getting the remaining hosts for this loop 24160 1726853553.76839: getting the next task for host managed_node1 24160 1726853553.76845: done getting next task for host managed_node1 24160 1726853553.76848: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24160 1726853553.76850: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853553.76858: getting variables 24160 1726853553.76860: in VariableManager get_vars() 24160 1726853553.76893: Calling all_inventory to load vars for managed_node1 24160 1726853553.76896: Calling groups_inventory to load vars for managed_node1 24160 1726853553.76898: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.76906: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.76909: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.76911: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.77952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.79402: done with get_vars() 24160 1726853553.79423: done getting variables 24160 1726853553.79491: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:32:33 -0400 (0:00:00.643) 0:00:30.197 ****** 24160 1726853553.79524: entering _queue_task() for managed_node1/service 24160 1726853553.79781: worker is 1 (out of 1 available) 24160 1726853553.79794: exiting _queue_task() for managed_node1/service 24160 1726853553.79806: done queuing things up, now waiting for results queue to drain 24160 1726853553.79808: waiting for pending results... 
24160 1726853553.79980: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24160 1726853553.80060: in run() - task 02083763-bbaf-5676-4eb4-000000000091 24160 1726853553.80072: variable 'ansible_search_path' from source: unknown 24160 1726853553.80075: variable 'ansible_search_path' from source: unknown 24160 1726853553.80103: calling self._execute() 24160 1726853553.80180: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.80184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.80193: variable 'omit' from source: magic vars 24160 1726853553.80481: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.80485: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.80560: variable 'network_provider' from source: set_fact 24160 1726853553.80564: Evaluated conditional (network_provider == "nm"): True 24160 1726853553.80629: variable '__network_wpa_supplicant_required' from source: role '' defaults 24160 1726853553.80691: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24160 1726853553.80810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853553.82652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853553.82697: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853553.82723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853553.82763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853553.82780: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853553.82844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.82873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.82888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.82914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.82924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.82960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.82980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.82996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.83021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.83032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.83060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.83079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.83100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.83124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.83135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.83234: variable 'network_connections' from source: play vars 24160 1726853553.83244: variable 'profile' from source: play vars 24160 1726853553.83296: variable 'profile' from source: play vars 24160 1726853553.83308: variable 'interface' from source: set_fact 24160 1726853553.83345: variable 'interface' from source: set_fact 24160 1726853553.83396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24160 1726853553.83506: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24160 1726853553.83536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24160 1726853553.83558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24160 1726853553.83585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24160 1726853553.83613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24160 1726853553.83633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24160 1726853553.83649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.83667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24160 1726853553.83706: variable '__network_wireless_connections_defined' from source: role '' defaults 24160 1726853553.83869: variable 'network_connections' from source: play vars 24160 1726853553.83874: variable 'profile' from source: play vars 24160 1726853553.83916: variable 'profile' from source: play vars 24160 1726853553.83920: variable 'interface' from source: set_fact 24160 1726853553.83960: variable 'interface' from source: set_fact 24160 1726853553.83988: Evaluated conditional (__network_wpa_supplicant_required): False 24160 1726853553.83992: when evaluation is False, skipping this task 24160 1726853553.83994: _execute() done 24160 1726853553.84006: dumping result 
to json 24160 1726853553.84008: done dumping result, returning 24160 1726853553.84011: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-5676-4eb4-000000000091] 24160 1726853553.84013: sending task result for task 02083763-bbaf-5676-4eb4-000000000091 24160 1726853553.84098: done sending task result for task 02083763-bbaf-5676-4eb4-000000000091 24160 1726853553.84100: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24160 1726853553.84144: no more pending results, returning what we have 24160 1726853553.84147: results queue empty 24160 1726853553.84148: checking for any_errors_fatal 24160 1726853553.84169: done checking for any_errors_fatal 24160 1726853553.84170: checking for max_fail_percentage 24160 1726853553.84174: done checking for max_fail_percentage 24160 1726853553.84175: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.84176: done checking to see if all hosts have failed 24160 1726853553.84177: getting the remaining hosts for this loop 24160 1726853553.84178: done getting the remaining hosts for this loop 24160 1726853553.84182: getting the next task for host managed_node1 24160 1726853553.84188: done getting next task for host managed_node1 24160 1726853553.84192: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24160 1726853553.84194: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853553.84206: getting variables 24160 1726853553.84207: in VariableManager get_vars() 24160 1726853553.84243: Calling all_inventory to load vars for managed_node1 24160 1726853553.84245: Calling groups_inventory to load vars for managed_node1 24160 1726853553.84247: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.84257: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.84260: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.84262: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.85062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.86039: done with get_vars() 24160 1726853553.86054: done getting variables 24160 1726853553.86097: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:32:33 -0400 (0:00:00.065) 0:00:30.263 ****** 24160 1726853553.86119: entering _queue_task() for managed_node1/service 24160 1726853553.86347: worker is 1 (out of 1 available) 24160 1726853553.86361: exiting _queue_task() for managed_node1/service 24160 1726853553.86376: done queuing things up, now waiting for results queue to drain 24160 1726853553.86377: waiting for pending results... 
24160 1726853553.86551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 24160 1726853553.86625: in run() - task 02083763-bbaf-5676-4eb4-000000000092 24160 1726853553.86638: variable 'ansible_search_path' from source: unknown 24160 1726853553.86642: variable 'ansible_search_path' from source: unknown 24160 1726853553.86674: calling self._execute() 24160 1726853553.86743: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.86747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.86756: variable 'omit' from source: magic vars 24160 1726853553.87036: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.87047: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.87126: variable 'network_provider' from source: set_fact 24160 1726853553.87130: Evaluated conditional (network_provider == "initscripts"): False 24160 1726853553.87133: when evaluation is False, skipping this task 24160 1726853553.87136: _execute() done 24160 1726853553.87138: dumping result to json 24160 1726853553.87142: done dumping result, returning 24160 1726853553.87155: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-5676-4eb4-000000000092] 24160 1726853553.87158: sending task result for task 02083763-bbaf-5676-4eb4-000000000092 24160 1726853553.87242: done sending task result for task 02083763-bbaf-5676-4eb4-000000000092 24160 1726853553.87244: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24160 1726853553.87309: no more pending results, returning what we have 24160 1726853553.87312: results queue empty 24160 1726853553.87313: checking for any_errors_fatal 24160 1726853553.87321: done checking for 
any_errors_fatal 24160 1726853553.87321: checking for max_fail_percentage 24160 1726853553.87323: done checking for max_fail_percentage 24160 1726853553.87323: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.87324: done checking to see if all hosts have failed 24160 1726853553.87325: getting the remaining hosts for this loop 24160 1726853553.87326: done getting the remaining hosts for this loop 24160 1726853553.87329: getting the next task for host managed_node1 24160 1726853553.87334: done getting next task for host managed_node1 24160 1726853553.87337: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24160 1726853553.87340: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853553.87351: getting variables 24160 1726853553.87352: in VariableManager get_vars() 24160 1726853553.87389: Calling all_inventory to load vars for managed_node1 24160 1726853553.87392: Calling groups_inventory to load vars for managed_node1 24160 1726853553.87394: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.87402: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.87404: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.87406: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.88133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.88996: done with get_vars() 24160 1726853553.89010: done getting variables 24160 1726853553.89050: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:32:33 -0400 (0:00:00.029) 0:00:30.293 ****** 24160 1726853553.89074: entering _queue_task() for managed_node1/copy 24160 1726853553.89276: worker is 1 (out of 1 available) 24160 1726853553.89290: exiting _queue_task() for managed_node1/copy 24160 1726853553.89300: done queuing things up, now waiting for results queue to drain 24160 1726853553.89302: waiting for pending results... 
24160 1726853553.89474: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24160 1726853553.89542: in run() - task 02083763-bbaf-5676-4eb4-000000000093 24160 1726853553.89549: variable 'ansible_search_path' from source: unknown 24160 1726853553.89552: variable 'ansible_search_path' from source: unknown 24160 1726853553.89588: calling self._execute() 24160 1726853553.89666: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.89672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.89681: variable 'omit' from source: magic vars 24160 1726853553.89980: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.89990: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.90064: variable 'network_provider' from source: set_fact 24160 1726853553.90068: Evaluated conditional (network_provider == "initscripts"): False 24160 1726853553.90073: when evaluation is False, skipping this task 24160 1726853553.90076: _execute() done 24160 1726853553.90078: dumping result to json 24160 1726853553.90083: done dumping result, returning 24160 1726853553.90092: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-5676-4eb4-000000000093] 24160 1726853553.90096: sending task result for task 02083763-bbaf-5676-4eb4-000000000093 24160 1726853553.90184: done sending task result for task 02083763-bbaf-5676-4eb4-000000000093 24160 1726853553.90187: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24160 1726853553.90237: no more pending results, returning what we have 24160 1726853553.90240: results queue empty 24160 1726853553.90241: checking for 
any_errors_fatal 24160 1726853553.90246: done checking for any_errors_fatal 24160 1726853553.90247: checking for max_fail_percentage 24160 1726853553.90248: done checking for max_fail_percentage 24160 1726853553.90249: checking to see if all hosts have failed and the running result is not ok 24160 1726853553.90250: done checking to see if all hosts have failed 24160 1726853553.90250: getting the remaining hosts for this loop 24160 1726853553.90252: done getting the remaining hosts for this loop 24160 1726853553.90255: getting the next task for host managed_node1 24160 1726853553.90260: done getting next task for host managed_node1 24160 1726853553.90263: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24160 1726853553.90265: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853553.90279: getting variables 24160 1726853553.90280: in VariableManager get_vars() 24160 1726853553.90309: Calling all_inventory to load vars for managed_node1 24160 1726853553.90311: Calling groups_inventory to load vars for managed_node1 24160 1726853553.90313: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853553.90320: Calling all_plugins_play to load vars for managed_node1 24160 1726853553.90323: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853553.90325: Calling groups_plugins_play to load vars for managed_node1 24160 1726853553.91174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853553.92519: done with get_vars() 24160 1726853553.92534: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:32:33 -0400 (0:00:00.035) 0:00:30.328 ****** 24160 1726853553.92600: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24160 1726853553.92807: worker is 1 (out of 1 available) 24160 1726853553.92819: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 24160 1726853553.92832: done queuing things up, now waiting for results queue to drain 24160 1726853553.92835: waiting for pending results... 
24160 1726853553.93007: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24160 1726853553.93074: in run() - task 02083763-bbaf-5676-4eb4-000000000094 24160 1726853553.93091: variable 'ansible_search_path' from source: unknown 24160 1726853553.93097: variable 'ansible_search_path' from source: unknown 24160 1726853553.93120: calling self._execute() 24160 1726853553.93199: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853553.93203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853553.93211: variable 'omit' from source: magic vars 24160 1726853553.93484: variable 'ansible_distribution_major_version' from source: facts 24160 1726853553.93495: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853553.93501: variable 'omit' from source: magic vars 24160 1726853553.93533: variable 'omit' from source: magic vars 24160 1726853553.93644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24160 1726853553.95776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24160 1726853553.95780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24160 1726853553.95782: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24160 1726853553.95784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24160 1726853553.95786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24160 1726853553.95825: variable 'network_provider' from source: set_fact 24160 1726853553.96066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24160 1726853553.96120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24160 1726853553.96160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24160 1726853553.96211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24160 1726853553.96234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24160 1726853553.96325: variable 'omit' from source: magic vars 24160 1726853553.96452: variable 'omit' from source: magic vars 24160 1726853553.96577: variable 'network_connections' from source: play vars 24160 1726853553.96711: variable 'profile' from source: play vars 24160 1726853553.96715: variable 'profile' from source: play vars 24160 1726853553.96718: variable 'interface' from source: set_fact 24160 1726853553.96758: variable 'interface' from source: set_fact 24160 1726853553.96910: variable 'omit' from source: magic vars 24160 1726853553.96933: variable '__lsr_ansible_managed' from source: task vars 24160 1726853553.96998: variable '__lsr_ansible_managed' from source: task vars 24160 1726853553.97296: Loaded config def from plugin (lookup/template) 24160 1726853553.97307: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24160 1726853553.97337: File lookup term: get_ansible_managed.j2 24160 
1726853553.97346: variable 'ansible_search_path' from source: unknown 24160 1726853553.97367: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24160 1726853553.97386: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24160 1726853553.97408: variable 'ansible_search_path' from source: unknown 24160 1726853554.06035: variable 'ansible_managed' from source: unknown 24160 1726853554.06112: variable 'omit' from source: magic vars 24160 1726853554.06130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853554.06147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853554.06178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853554.06181: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.06183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.06190: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853554.06193: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.06197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.06252: Set connection var ansible_shell_executable to /bin/sh 24160 1726853554.06259: Set connection var ansible_pipelining to False 24160 1726853554.06262: Set connection var ansible_connection to ssh 24160 1726853554.06264: Set connection var ansible_shell_type to sh 24160 1726853554.06273: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853554.06280: Set connection var ansible_timeout to 10 24160 1726853554.06297: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.06300: variable 'ansible_connection' from source: unknown 24160 1726853554.06302: variable 'ansible_module_compression' from source: unknown 24160 1726853554.06304: variable 'ansible_shell_type' from source: unknown 24160 1726853554.06307: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.06309: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.06313: variable 'ansible_pipelining' from source: unknown 24160 1726853554.06315: variable 'ansible_timeout' from source: unknown 24160 1726853554.06319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.06403: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853554.06414: variable 'omit' from source: magic vars 24160 1726853554.06416: starting attempt loop 24160 1726853554.06419: running the handler 24160 1726853554.06426: _low_level_execute_command(): starting 24160 1726853554.06431: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853554.06899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.06903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.06906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.06958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.06961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.06963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.07004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.08706: stdout 
chunk (state=3): >>>/root <<< 24160 1726853554.08823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.08826: stderr chunk (state=3): >>><<< 24160 1726853554.08829: stdout chunk (state=3): >>><<< 24160 1726853554.08839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.08878: _low_level_execute_command(): starting 24160 1726853554.08881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339 `" && echo ansible-tmp-1726853554.0884552-25630-278181712471339="` echo /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339 `" ) && sleep 0' 24160 1726853554.09281: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.09285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.09287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853554.09290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.09336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.09340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.09383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.11283: stdout chunk (state=3): >>>ansible-tmp-1726853554.0884552-25630-278181712471339=/root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339 <<< 24160 1726853554.11388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.11415: stderr chunk (state=3): >>><<< 24160 1726853554.11418: stdout chunk (state=3): >>><<< 24160 1726853554.11428: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853554.0884552-25630-278181712471339=/root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.11476: variable 'ansible_module_compression' from source: unknown 24160 1726853554.11498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24160 1726853554.11526: variable 'ansible_facts' from source: unknown 24160 1726853554.11610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py 24160 1726853554.11700: Sending initial data 24160 1726853554.11704: Sent initial data (168 bytes) 24160 1726853554.12110: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853554.12113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853554.12116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853554.12123: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853554.12125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.12174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.12179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.12218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.13757: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853554.13796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853554.13836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmps1_hu0aa /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py <<< 24160 1726853554.13840: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py" <<< 24160 1726853554.13875: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmps1_hu0aa" to remote "/root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py" <<< 24160 1726853554.13878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py" <<< 24160 1726853554.14569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.14608: stderr chunk (state=3): >>><<< 24160 1726853554.14611: stdout chunk (state=3): >>><<< 24160 1726853554.14651: done transferring module to remote 24160 1726853554.14660: _low_level_execute_command(): starting 24160 1726853554.14663: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/ /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py && sleep 0' 24160 1726853554.15097: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.15100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.15102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.15104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.15152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.15155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.15201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.16933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.16955: stderr chunk (state=3): >>><<< 24160 1726853554.16958: stdout chunk (state=3): >>><<< 24160 1726853554.16974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.16977: _low_level_execute_command(): starting 24160 1726853554.16982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/AnsiballZ_network_connections.py && sleep 0' 24160 1726853554.17376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853554.17400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853554.17404: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853554.17407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.17458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.17464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.17504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.44838: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_t2o5_kzb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_t2o5_kzb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/b1645e58-2740-4ae7-b45a-6a29b04ac1fe: error=unknown <<< 24160 1726853554.44970: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24160 1726853554.46742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853554.46773: stderr chunk (state=3): >>><<< 24160 1726853554.46777: stdout chunk (state=3): >>><<< 24160 1726853554.46796: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_t2o5_kzb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_t2o5_kzb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/b1645e58-2740-4ae7-b45a-6a29b04ac1fe: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853554.46822: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853554.46829: _low_level_execute_command(): starting 24160 1726853554.46834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853554.0884552-25630-278181712471339/ > /dev/null 2>&1 && sleep 0' 24160 1726853554.47267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853554.47270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853554.47274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853554.47277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.47325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.47331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.47376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.49186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.49209: stderr chunk (state=3): >>><<< 24160 1726853554.49213: stdout chunk (state=3): >>><<< 24160 1726853554.49225: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.49232: handler run complete 24160 1726853554.49252: attempt loop complete, returning result 24160 1726853554.49258: _execute() done 24160 1726853554.49260: dumping result to json 24160 1726853554.49262: done dumping result, returning 24160 1726853554.49268: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-5676-4eb4-000000000094] 24160 1726853554.49273: sending task result for task 02083763-bbaf-5676-4eb4-000000000094 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 24160 1726853554.49463: no more pending results, returning what we have 24160 1726853554.49466: results queue empty 24160 1726853554.49467: checking for any_errors_fatal 24160 1726853554.49477: done checking for any_errors_fatal 24160 1726853554.49478: checking for max_fail_percentage 24160 1726853554.49479: done checking for max_fail_percentage 24160 1726853554.49480: checking to see if all hosts have failed and the running result is not ok 24160 1726853554.49481: done checking to see if all hosts have failed 24160 1726853554.49481: getting the remaining hosts for this loop 24160 1726853554.49489: done getting the remaining hosts for this loop 24160 1726853554.49493: getting the next task for host managed_node1 24160 1726853554.49498: done getting next task for host managed_node1 24160 1726853554.49502: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24160 1726853554.49504: ^ state 
is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853554.49511: done sending task result for task 02083763-bbaf-5676-4eb4-000000000094 24160 1726853554.49514: WORKER PROCESS EXITING 24160 1726853554.49520: getting variables 24160 1726853554.49521: in VariableManager get_vars() 24160 1726853554.49557: Calling all_inventory to load vars for managed_node1 24160 1726853554.49559: Calling groups_inventory to load vars for managed_node1 24160 1726853554.49562: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853554.49570: Calling all_plugins_play to load vars for managed_node1 24160 1726853554.49574: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853554.49577: Calling groups_plugins_play to load vars for managed_node1 24160 1726853554.50428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853554.51404: done with get_vars() 24160 1726853554.51426: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:32:34 -0400 (0:00:00.589) 0:00:30.917 ****** 24160 1726853554.51520: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24160 1726853554.51877: worker is 1 (out of 1 available) 24160 1726853554.51890: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 24160 1726853554.51904: done queuing things up, now waiting for results queue to drain 24160 1726853554.51906: waiting for pending results... 
24160 1726853554.52290: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 24160 1726853554.52316: in run() - task 02083763-bbaf-5676-4eb4-000000000095 24160 1726853554.52335: variable 'ansible_search_path' from source: unknown 24160 1726853554.52342: variable 'ansible_search_path' from source: unknown 24160 1726853554.52391: calling self._execute() 24160 1726853554.52497: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.52509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.52524: variable 'omit' from source: magic vars 24160 1726853554.52885: variable 'ansible_distribution_major_version' from source: facts 24160 1726853554.52895: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853554.52993: variable 'network_state' from source: role '' defaults 24160 1726853554.53003: Evaluated conditional (network_state != {}): False 24160 1726853554.53007: when evaluation is False, skipping this task 24160 1726853554.53010: _execute() done 24160 1726853554.53012: dumping result to json 24160 1726853554.53015: done dumping result, returning 24160 1726853554.53021: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-5676-4eb4-000000000095] 24160 1726853554.53026: sending task result for task 02083763-bbaf-5676-4eb4-000000000095 24160 1726853554.53112: done sending task result for task 02083763-bbaf-5676-4eb4-000000000095 24160 1726853554.53115: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24160 1726853554.53188: no more pending results, returning what we have 24160 1726853554.53192: results queue empty 24160 1726853554.53193: checking for any_errors_fatal 24160 1726853554.53204: done checking for any_errors_fatal 
24160 1726853554.53205: checking for max_fail_percentage 24160 1726853554.53206: done checking for max_fail_percentage 24160 1726853554.53207: checking to see if all hosts have failed and the running result is not ok 24160 1726853554.53208: done checking to see if all hosts have failed 24160 1726853554.53208: getting the remaining hosts for this loop 24160 1726853554.53210: done getting the remaining hosts for this loop 24160 1726853554.53214: getting the next task for host managed_node1 24160 1726853554.53219: done getting next task for host managed_node1 24160 1726853554.53222: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24160 1726853554.53225: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853554.53238: getting variables 24160 1726853554.53239: in VariableManager get_vars() 24160 1726853554.53269: Calling all_inventory to load vars for managed_node1 24160 1726853554.53273: Calling groups_inventory to load vars for managed_node1 24160 1726853554.53275: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853554.53283: Calling all_plugins_play to load vars for managed_node1 24160 1726853554.53286: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853554.53288: Calling groups_plugins_play to load vars for managed_node1 24160 1726853554.54160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853554.55319: done with get_vars() 24160 1726853554.55337: done getting variables 24160 1726853554.55392: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:32:34 -0400 (0:00:00.038) 0:00:30.956 ****** 24160 1726853554.55415: entering _queue_task() for managed_node1/debug 24160 1726853554.55688: worker is 1 (out of 1 available) 24160 1726853554.55699: exiting _queue_task() for managed_node1/debug 24160 1726853554.55713: done queuing things up, now waiting for results queue to drain 24160 1726853554.55714: waiting for pending results... 
24160 1726853554.56088: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24160 1726853554.56131: in run() - task 02083763-bbaf-5676-4eb4-000000000096 24160 1726853554.56151: variable 'ansible_search_path' from source: unknown 24160 1726853554.56158: variable 'ansible_search_path' from source: unknown 24160 1726853554.56203: calling self._execute() 24160 1726853554.56306: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.56378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.56383: variable 'omit' from source: magic vars 24160 1726853554.56723: variable 'ansible_distribution_major_version' from source: facts 24160 1726853554.56727: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853554.56754: variable 'omit' from source: magic vars 24160 1726853554.56784: variable 'omit' from source: magic vars 24160 1726853554.56809: variable 'omit' from source: magic vars 24160 1726853554.56840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853554.56876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853554.56887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853554.56900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.56909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.56934: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853554.56943: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.56946: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24160 1726853554.57024: Set connection var ansible_shell_executable to /bin/sh 24160 1726853554.57028: Set connection var ansible_pipelining to False 24160 1726853554.57032: Set connection var ansible_connection to ssh 24160 1726853554.57034: Set connection var ansible_shell_type to sh 24160 1726853554.57042: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853554.57049: Set connection var ansible_timeout to 10 24160 1726853554.57067: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.57072: variable 'ansible_connection' from source: unknown 24160 1726853554.57076: variable 'ansible_module_compression' from source: unknown 24160 1726853554.57078: variable 'ansible_shell_type' from source: unknown 24160 1726853554.57082: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.57084: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.57088: variable 'ansible_pipelining' from source: unknown 24160 1726853554.57091: variable 'ansible_timeout' from source: unknown 24160 1726853554.57093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.57193: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853554.57211: variable 'omit' from source: magic vars 24160 1726853554.57214: starting attempt loop 24160 1726853554.57217: running the handler 24160 1726853554.57307: variable '__network_connections_result' from source: set_fact 24160 1726853554.57348: handler run complete 24160 1726853554.57364: attempt loop complete, returning result 24160 1726853554.57367: _execute() done 24160 1726853554.57370: dumping result to json 24160 1726853554.57374: 
done dumping result, returning 24160 1726853554.57382: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-5676-4eb4-000000000096] 24160 1726853554.57387: sending task result for task 02083763-bbaf-5676-4eb4-000000000096 24160 1726853554.57462: done sending task result for task 02083763-bbaf-5676-4eb4-000000000096 24160 1726853554.57465: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 24160 1726853554.57519: no more pending results, returning what we have 24160 1726853554.57522: results queue empty 24160 1726853554.57523: checking for any_errors_fatal 24160 1726853554.57531: done checking for any_errors_fatal 24160 1726853554.57532: checking for max_fail_percentage 24160 1726853554.57533: done checking for max_fail_percentage 24160 1726853554.57534: checking to see if all hosts have failed and the running result is not ok 24160 1726853554.57535: done checking to see if all hosts have failed 24160 1726853554.57535: getting the remaining hosts for this loop 24160 1726853554.57537: done getting the remaining hosts for this loop 24160 1726853554.57540: getting the next task for host managed_node1 24160 1726853554.57546: done getting next task for host managed_node1 24160 1726853554.57549: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24160 1726853554.57551: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853554.57560: getting variables 24160 1726853554.57561: in VariableManager get_vars() 24160 1726853554.57596: Calling all_inventory to load vars for managed_node1 24160 1726853554.57599: Calling groups_inventory to load vars for managed_node1 24160 1726853554.57601: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853554.57609: Calling all_plugins_play to load vars for managed_node1 24160 1726853554.57611: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853554.57614: Calling groups_plugins_play to load vars for managed_node1 24160 1726853554.58387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853554.59791: done with get_vars() 24160 1726853554.59806: done getting variables 24160 1726853554.59846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:32:34 -0400 (0:00:00.044) 0:00:31.001 ****** 24160 1726853554.59869: entering _queue_task() for managed_node1/debug 24160 1726853554.60088: worker is 1 (out of 1 available) 24160 1726853554.60102: exiting _queue_task() for managed_node1/debug 24160 1726853554.60115: done queuing things up, now waiting for results queue to drain 24160 1726853554.60116: waiting for pending results... 
24160 1726853554.60288: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24160 1726853554.60356: in run() - task 02083763-bbaf-5676-4eb4-000000000097 24160 1726853554.60374: variable 'ansible_search_path' from source: unknown 24160 1726853554.60378: variable 'ansible_search_path' from source: unknown 24160 1726853554.60403: calling self._execute() 24160 1726853554.60478: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.60483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.60492: variable 'omit' from source: magic vars 24160 1726853554.60769: variable 'ansible_distribution_major_version' from source: facts 24160 1726853554.60782: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853554.60792: variable 'omit' from source: magic vars 24160 1726853554.60818: variable 'omit' from source: magic vars 24160 1726853554.60842: variable 'omit' from source: magic vars 24160 1726853554.60876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853554.60904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853554.60920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853554.60932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.60942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.60968: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853554.60973: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.60975: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 24160 1726853554.61046: Set connection var ansible_shell_executable to /bin/sh 24160 1726853554.61050: Set connection var ansible_pipelining to False 24160 1726853554.61053: Set connection var ansible_connection to ssh 24160 1726853554.61059: Set connection var ansible_shell_type to sh 24160 1726853554.61066: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853554.61076: Set connection var ansible_timeout to 10 24160 1726853554.61092: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.61095: variable 'ansible_connection' from source: unknown 24160 1726853554.61099: variable 'ansible_module_compression' from source: unknown 24160 1726853554.61101: variable 'ansible_shell_type' from source: unknown 24160 1726853554.61104: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.61106: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.61109: variable 'ansible_pipelining' from source: unknown 24160 1726853554.61112: variable 'ansible_timeout' from source: unknown 24160 1726853554.61114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.61215: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853554.61232: variable 'omit' from source: magic vars 24160 1726853554.61235: starting attempt loop 24160 1726853554.61238: running the handler 24160 1726853554.61276: variable '__network_connections_result' from source: set_fact 24160 1726853554.61328: variable '__network_connections_result' from source: set_fact 24160 1726853554.61404: handler run complete 24160 1726853554.61420: attempt loop complete, returning result 24160 1726853554.61423: 
_execute() done 24160 1726853554.61426: dumping result to json 24160 1726853554.61431: done dumping result, returning 24160 1726853554.61438: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-5676-4eb4-000000000097] 24160 1726853554.61442: sending task result for task 02083763-bbaf-5676-4eb4-000000000097 24160 1726853554.61526: done sending task result for task 02083763-bbaf-5676-4eb4-000000000097 24160 1726853554.61529: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24160 1726853554.61628: no more pending results, returning what we have 24160 1726853554.61631: results queue empty 24160 1726853554.61632: checking for any_errors_fatal 24160 1726853554.61636: done checking for any_errors_fatal 24160 1726853554.61637: checking for max_fail_percentage 24160 1726853554.61638: done checking for max_fail_percentage 24160 1726853554.61639: checking to see if all hosts have failed and the running result is not ok 24160 1726853554.61640: done checking to see if all hosts have failed 24160 1726853554.61640: getting the remaining hosts for this loop 24160 1726853554.61641: done getting the remaining hosts for this loop 24160 1726853554.61644: getting the next task for host managed_node1 24160 1726853554.61648: done getting next task for host managed_node1 24160 1726853554.61651: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24160 1726853554.61653: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853554.61663: getting variables 24160 1726853554.61664: in VariableManager get_vars() 24160 1726853554.61695: Calling all_inventory to load vars for managed_node1 24160 1726853554.61697: Calling groups_inventory to load vars for managed_node1 24160 1726853554.61699: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853554.61706: Calling all_plugins_play to load vars for managed_node1 24160 1726853554.61709: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853554.61711: Calling groups_plugins_play to load vars for managed_node1 24160 1726853554.65792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853554.66646: done with get_vars() 24160 1726853554.66662: done getting variables 24160 1726853554.66697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:32:34 -0400 (0:00:00.068) 0:00:31.069 ****** 24160 1726853554.66722: entering _queue_task() for managed_node1/debug 24160 1726853554.66985: worker is 1 (out of 1 available) 24160 1726853554.66999: exiting _queue_task() for managed_node1/debug 24160 1726853554.67012: done queuing things up, now waiting for results queue to drain 24160 1726853554.67015: waiting for pending results... 
24160 1726853554.67205: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24160 1726853554.67286: in run() - task 02083763-bbaf-5676-4eb4-000000000098 24160 1726853554.67299: variable 'ansible_search_path' from source: unknown 24160 1726853554.67303: variable 'ansible_search_path' from source: unknown 24160 1726853554.67330: calling self._execute() 24160 1726853554.67411: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.67416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.67424: variable 'omit' from source: magic vars 24160 1726853554.67718: variable 'ansible_distribution_major_version' from source: facts 24160 1726853554.67729: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853554.67814: variable 'network_state' from source: role '' defaults 24160 1726853554.67822: Evaluated conditional (network_state != {}): False 24160 1726853554.67827: when evaluation is False, skipping this task 24160 1726853554.67829: _execute() done 24160 1726853554.67833: dumping result to json 24160 1726853554.67836: done dumping result, returning 24160 1726853554.67843: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-5676-4eb4-000000000098] 24160 1726853554.67847: sending task result for task 02083763-bbaf-5676-4eb4-000000000098 24160 1726853554.67934: done sending task result for task 02083763-bbaf-5676-4eb4-000000000098 24160 1726853554.67937: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 24160 1726853554.67981: no more pending results, returning what we have 24160 1726853554.67984: results queue empty 24160 1726853554.67985: checking for any_errors_fatal 24160 1726853554.67996: done checking for any_errors_fatal 24160 1726853554.67997: checking for 
max_fail_percentage 24160 1726853554.67998: done checking for max_fail_percentage 24160 1726853554.67999: checking to see if all hosts have failed and the running result is not ok 24160 1726853554.68000: done checking to see if all hosts have failed 24160 1726853554.68000: getting the remaining hosts for this loop 24160 1726853554.68002: done getting the remaining hosts for this loop 24160 1726853554.68005: getting the next task for host managed_node1 24160 1726853554.68010: done getting next task for host managed_node1 24160 1726853554.68014: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24160 1726853554.68017: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853554.68030: getting variables 24160 1726853554.68031: in VariableManager get_vars() 24160 1726853554.68066: Calling all_inventory to load vars for managed_node1 24160 1726853554.68069: Calling groups_inventory to load vars for managed_node1 24160 1726853554.68073: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853554.68083: Calling all_plugins_play to load vars for managed_node1 24160 1726853554.68086: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853554.68088: Calling groups_plugins_play to load vars for managed_node1 24160 1726853554.68929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853554.69819: done with get_vars() 24160 1726853554.69835: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:32:34 -0400 
(0:00:00.031) 0:00:31.101 ****** 24160 1726853554.69903: entering _queue_task() for managed_node1/ping 24160 1726853554.70133: worker is 1 (out of 1 available) 24160 1726853554.70147: exiting _queue_task() for managed_node1/ping 24160 1726853554.70159: done queuing things up, now waiting for results queue to drain 24160 1726853554.70161: waiting for pending results... 24160 1726853554.70342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 24160 1726853554.70430: in run() - task 02083763-bbaf-5676-4eb4-000000000099 24160 1726853554.70442: variable 'ansible_search_path' from source: unknown 24160 1726853554.70445: variable 'ansible_search_path' from source: unknown 24160 1726853554.70476: calling self._execute() 24160 1726853554.70558: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.70561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.70568: variable 'omit' from source: magic vars 24160 1726853554.70852: variable 'ansible_distribution_major_version' from source: facts 24160 1726853554.70863: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853554.70869: variable 'omit' from source: magic vars 24160 1726853554.70902: variable 'omit' from source: magic vars 24160 1726853554.70926: variable 'omit' from source: magic vars 24160 1726853554.70961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853554.70988: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853554.71005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853554.71018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.71027: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853554.71056: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853554.71061: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.71063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.71130: Set connection var ansible_shell_executable to /bin/sh 24160 1726853554.71134: Set connection var ansible_pipelining to False 24160 1726853554.71137: Set connection var ansible_connection to ssh 24160 1726853554.71140: Set connection var ansible_shell_type to sh 24160 1726853554.71152: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853554.71158: Set connection var ansible_timeout to 10 24160 1726853554.71174: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.71178: variable 'ansible_connection' from source: unknown 24160 1726853554.71181: variable 'ansible_module_compression' from source: unknown 24160 1726853554.71183: variable 'ansible_shell_type' from source: unknown 24160 1726853554.71186: variable 'ansible_shell_executable' from source: unknown 24160 1726853554.71188: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853554.71190: variable 'ansible_pipelining' from source: unknown 24160 1726853554.71192: variable 'ansible_timeout' from source: unknown 24160 1726853554.71197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853554.71341: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853554.71350: variable 'omit' from source: magic vars 24160 1726853554.71357: starting attempt loop 24160 1726853554.71360: running 
the handler 24160 1726853554.71372: _low_level_execute_command(): starting 24160 1726853554.71384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853554.71877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.71894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.71953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.71960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.71964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.72008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.73698: stdout chunk (state=3): >>>/root <<< 24160 1726853554.73798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.73824: stderr chunk (state=3): >>><<< 24160 1726853554.73827: stdout chunk (state=3): >>><<< 24160 1726853554.73852: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.73864: _low_level_execute_command(): starting 24160 1726853554.73868: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042 `" && echo ansible-tmp-1726853554.7385128-25669-58044007951042="` echo /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042 `" ) && sleep 0' 24160 1726853554.74310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.74313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.74316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853554.74325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853554.74328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.74367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.74375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.74378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.74419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.76295: stdout chunk (state=3): >>>ansible-tmp-1726853554.7385128-25669-58044007951042=/root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042 <<< 24160 1726853554.76402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.76427: stderr chunk (state=3): >>><<< 24160 1726853554.76430: stdout chunk (state=3): >>><<< 24160 1726853554.76444: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853554.7385128-25669-58044007951042=/root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.76485: variable 'ansible_module_compression' from source: unknown 24160 1726853554.76517: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24160 1726853554.76550: variable 'ansible_facts' from source: unknown 24160 1726853554.76605: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py 24160 1726853554.76703: Sending initial data 24160 1726853554.76706: Sent initial data (152 bytes) 24160 1726853554.77139: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853554.77142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.77145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853554.77148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853554.77149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.77190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.77194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.77203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.77247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.78770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 
1726853554.78807: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853554.78846: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmps3rfmr9y /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py <<< 24160 1726853554.78851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py" <<< 24160 1726853554.78888: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmps3rfmr9y" to remote "/root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py" <<< 24160 1726853554.78897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py" <<< 24160 1726853554.79402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.79440: stderr chunk (state=3): >>><<< 24160 1726853554.79444: stdout chunk (state=3): >>><<< 24160 1726853554.79462: done transferring module to remote 24160 1726853554.79470: _low_level_execute_command(): starting 24160 1726853554.79475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/ /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py && sleep 0' 24160 1726853554.80080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853554.80084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853554.80176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.80212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.80215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.80265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853554.81975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.82001: stderr chunk (state=3): >>><<< 24160 1726853554.82004: stdout chunk (state=3): >>><<< 24160 1726853554.82017: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853554.82020: _low_level_execute_command(): starting 24160 1726853554.82026: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/AnsiballZ_ping.py && sleep 0' 24160 1726853554.82649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.82667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.82884: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24160 1726853554.97609: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24160 1726853554.98840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853554.98844: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 24160 1726853554.98869: stderr chunk (state=3): >>><<< 24160 1726853554.98874: stdout chunk (state=3): >>><<< 24160 1726853554.98888: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853554.98909: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853554.98918: _low_level_execute_command(): starting 24160 1726853554.98922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853554.7385128-25669-58044007951042/ > /dev/null 2>&1 && sleep 0' 24160 1726853554.99583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853554.99633: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853554.99650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853554.99688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853554.99751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853555.01576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853555.01601: stderr chunk (state=3): >>><<< 24160 1726853555.01604: stdout chunk (state=3): >>><<< 24160 1726853555.01621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853555.01630: handler run complete 24160 1726853555.01640: attempt loop complete, returning result 24160 1726853555.01643: _execute() done 
24160 1726853555.01645: dumping result to json 24160 1726853555.01650: done dumping result, returning 24160 1726853555.01660: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-5676-4eb4-000000000099] 24160 1726853555.01663: sending task result for task 02083763-bbaf-5676-4eb4-000000000099 24160 1726853555.01749: done sending task result for task 02083763-bbaf-5676-4eb4-000000000099 24160 1726853555.01752: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 24160 1726853555.01808: no more pending results, returning what we have 24160 1726853555.01812: results queue empty 24160 1726853555.01813: checking for any_errors_fatal 24160 1726853555.01819: done checking for any_errors_fatal 24160 1726853555.01820: checking for max_fail_percentage 24160 1726853555.01822: done checking for max_fail_percentage 24160 1726853555.01823: checking to see if all hosts have failed and the running result is not ok 24160 1726853555.01823: done checking to see if all hosts have failed 24160 1726853555.01824: getting the remaining hosts for this loop 24160 1726853555.01825: done getting the remaining hosts for this loop 24160 1726853555.01829: getting the next task for host managed_node1 24160 1726853555.01836: done getting next task for host managed_node1 24160 1726853555.01839: ^ task is: TASK: meta (role_complete) 24160 1726853555.01841: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853555.01850: getting variables 24160 1726853555.01851: in VariableManager get_vars() 24160 1726853555.01898: Calling all_inventory to load vars for managed_node1 24160 1726853555.01901: Calling groups_inventory to load vars for managed_node1 24160 1726853555.01904: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853555.01914: Calling all_plugins_play to load vars for managed_node1 24160 1726853555.01916: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853555.01919: Calling groups_plugins_play to load vars for managed_node1 24160 1726853555.03337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853555.05068: done with get_vars() 24160 1726853555.05093: done getting variables 24160 1726853555.05184: done queuing things up, now waiting for results queue to drain 24160 1726853555.05187: results queue empty 24160 1726853555.05188: checking for any_errors_fatal 24160 1726853555.05190: done checking for any_errors_fatal 24160 1726853555.05191: checking for max_fail_percentage 24160 1726853555.05192: done checking for max_fail_percentage 24160 1726853555.05193: checking to see if all hosts have failed and the running result is not ok 24160 1726853555.05194: done checking to see if all hosts have failed 24160 1726853555.05194: getting the remaining hosts for this loop 24160 1726853555.05195: done getting the remaining hosts for this loop 24160 1726853555.05198: getting the next task for host managed_node1 24160 1726853555.05202: done getting next task for host managed_node1 24160 1726853555.05203: ^ task is: TASK: meta (flush_handlers) 24160 1726853555.05205: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 24160 1726853555.05208: getting variables 24160 1726853555.05209: in VariableManager get_vars() 24160 1726853555.05229: Calling all_inventory to load vars for managed_node1 24160 1726853555.05231: Calling groups_inventory to load vars for managed_node1 24160 1726853555.05233: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853555.05238: Calling all_plugins_play to load vars for managed_node1 24160 1726853555.05241: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853555.05243: Calling groups_plugins_play to load vars for managed_node1 24160 1726853555.06564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853555.08276: done with get_vars() 24160 1726853555.08295: done getting variables 24160 1726853555.08349: in VariableManager get_vars() 24160 1726853555.08361: Calling all_inventory to load vars for managed_node1 24160 1726853555.08363: Calling groups_inventory to load vars for managed_node1 24160 1726853555.08365: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853555.08370: Calling all_plugins_play to load vars for managed_node1 24160 1726853555.08374: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853555.08377: Calling groups_plugins_play to load vars for managed_node1 24160 1726853555.09588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853555.11231: done with get_vars() 24160 1726853555.11256: done queuing things up, now waiting for results queue to drain 24160 1726853555.11258: results queue empty 24160 1726853555.11259: checking for any_errors_fatal 24160 1726853555.11260: done checking for any_errors_fatal 24160 1726853555.11261: checking for max_fail_percentage 24160 1726853555.11262: done checking for max_fail_percentage 24160 1726853555.11263: checking to see if all hosts have failed and 
the running result is not ok 24160 1726853555.11264: done checking to see if all hosts have failed 24160 1726853555.11264: getting the remaining hosts for this loop 24160 1726853555.11265: done getting the remaining hosts for this loop 24160 1726853555.11268: getting the next task for host managed_node1 24160 1726853555.11277: done getting next task for host managed_node1 24160 1726853555.11282: ^ task is: TASK: meta (flush_handlers) 24160 1726853555.11284: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853555.11287: getting variables 24160 1726853555.11288: in VariableManager get_vars() 24160 1726853555.11300: Calling all_inventory to load vars for managed_node1 24160 1726853555.11302: Calling groups_inventory to load vars for managed_node1 24160 1726853555.11304: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853555.11310: Calling all_plugins_play to load vars for managed_node1 24160 1726853555.11313: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853555.11316: Calling groups_plugins_play to load vars for managed_node1 24160 1726853555.12612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853555.14262: done with get_vars() 24160 1726853555.14284: done getting variables 24160 1726853555.14338: in VariableManager get_vars() 24160 1726853555.14350: Calling all_inventory to load vars for managed_node1 24160 1726853555.14352: Calling groups_inventory to load vars for managed_node1 24160 1726853555.14354: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853555.14358: Calling all_plugins_play to load vars for managed_node1 24160 1726853555.14360: Calling 
groups_plugins_inventory to load vars for managed_node1 24160 1726853555.14363: Calling groups_plugins_play to load vars for managed_node1 24160 1726853555.15527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853555.16880: done with get_vars() 24160 1726853555.16897: done queuing things up, now waiting for results queue to drain 24160 1726853555.16899: results queue empty 24160 1726853555.16899: checking for any_errors_fatal 24160 1726853555.16900: done checking for any_errors_fatal 24160 1726853555.16901: checking for max_fail_percentage 24160 1726853555.16901: done checking for max_fail_percentage 24160 1726853555.16902: checking to see if all hosts have failed and the running result is not ok 24160 1726853555.16902: done checking to see if all hosts have failed 24160 1726853555.16903: getting the remaining hosts for this loop 24160 1726853555.16903: done getting the remaining hosts for this loop 24160 1726853555.16905: getting the next task for host managed_node1 24160 1726853555.16907: done getting next task for host managed_node1 24160 1726853555.16908: ^ task is: None 24160 1726853555.16909: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853555.16910: done queuing things up, now waiting for results queue to drain 24160 1726853555.16910: results queue empty 24160 1726853555.16910: checking for any_errors_fatal 24160 1726853555.16911: done checking for any_errors_fatal 24160 1726853555.16911: checking for max_fail_percentage 24160 1726853555.16912: done checking for max_fail_percentage 24160 1726853555.16912: checking to see if all hosts have failed and the running result is not ok 24160 1726853555.16913: done checking to see if all hosts have failed 24160 1726853555.16913: getting the next task for host managed_node1 24160 1726853555.16915: done getting next task for host managed_node1 24160 1726853555.16915: ^ task is: None 24160 1726853555.16916: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853555.16949: in VariableManager get_vars() 24160 1726853555.16961: done with get_vars() 24160 1726853555.16965: in VariableManager get_vars() 24160 1726853555.16973: done with get_vars() 24160 1726853555.16976: variable 'omit' from source: magic vars 24160 1726853555.16996: in VariableManager get_vars() 24160 1726853555.17002: done with get_vars() 24160 1726853555.17016: variable 'omit' from source: magic vars PLAY [Delete the interface, then assert that device and profile are absent] **** 24160 1726853555.17180: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24160 1726853555.17198: getting the remaining hosts for this loop 24160 1726853555.17199: done getting the remaining hosts for this loop 24160 1726853555.17201: getting the next task for host managed_node1 24160 1726853555.17203: done getting next task for host managed_node1 24160 1726853555.17204: ^ task is: TASK: Gathering Facts 24160 1726853555.17205: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853555.17207: getting variables 24160 1726853555.17207: in VariableManager get_vars() 24160 1726853555.17213: Calling all_inventory to load vars for managed_node1 24160 1726853555.17214: Calling groups_inventory to load vars for managed_node1 24160 1726853555.17215: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853555.17219: Calling all_plugins_play to load vars for managed_node1 24160 1726853555.17220: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853555.17222: Calling groups_plugins_play to load vars for managed_node1 24160 1726853555.17884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853555.19088: done with get_vars() 24160 1726853555.19108: done getting variables 24160 1726853555.19139: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 Friday 20 September 2024 13:32:35 -0400 (0:00:00.492) 0:00:31.594 ****** 24160 1726853555.19156: entering _queue_task() for managed_node1/gather_facts 24160 1726853555.19490: worker is 1 (out of 1 available) 24160 1726853555.19503: exiting _queue_task() for managed_node1/gather_facts 24160 1726853555.19517: done queuing things up, now waiting for results queue to drain 24160 1726853555.19519: waiting for pending results... 
24160 1726853555.19988: running TaskExecutor() for managed_node1/TASK: Gathering Facts 24160 1726853555.19993: in run() - task 02083763-bbaf-5676-4eb4-0000000005ee 24160 1726853555.19996: variable 'ansible_search_path' from source: unknown 24160 1726853555.19999: calling self._execute() 24160 1726853555.20060: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853555.20075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853555.20089: variable 'omit' from source: magic vars 24160 1726853555.20468: variable 'ansible_distribution_major_version' from source: facts 24160 1726853555.20480: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853555.20487: variable 'omit' from source: magic vars 24160 1726853555.20509: variable 'omit' from source: magic vars 24160 1726853555.20537: variable 'omit' from source: magic vars 24160 1726853555.20575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853555.20601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853555.20618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853555.20633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853555.20643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853555.20669: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853555.20674: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853555.20677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853555.20747: Set connection var ansible_shell_executable to /bin/sh 24160 1726853555.20751: Set 
connection var ansible_pipelining to False 24160 1726853555.20756: Set connection var ansible_connection to ssh 24160 1726853555.20758: Set connection var ansible_shell_type to sh 24160 1726853555.20766: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853555.20774: Set connection var ansible_timeout to 10 24160 1726853555.20790: variable 'ansible_shell_executable' from source: unknown 24160 1726853555.20793: variable 'ansible_connection' from source: unknown 24160 1726853555.20796: variable 'ansible_module_compression' from source: unknown 24160 1726853555.20799: variable 'ansible_shell_type' from source: unknown 24160 1726853555.20801: variable 'ansible_shell_executable' from source: unknown 24160 1726853555.20804: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853555.20806: variable 'ansible_pipelining' from source: unknown 24160 1726853555.20810: variable 'ansible_timeout' from source: unknown 24160 1726853555.20814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853555.20943: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853555.20951: variable 'omit' from source: magic vars 24160 1726853555.20958: starting attempt loop 24160 1726853555.20962: running the handler 24160 1726853555.20974: variable 'ansible_facts' from source: unknown 24160 1726853555.20992: _low_level_execute_command(): starting 24160 1726853555.20999: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853555.21485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 
1726853555.21490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853555.21493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.21537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853555.21555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853555.21598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853555.23277: stdout chunk (state=3): >>>/root <<< 24160 1726853555.23365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853555.23395: stderr chunk (state=3): >>><<< 24160 1726853555.23398: stdout chunk (state=3): >>><<< 24160 1726853555.23417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853555.23427: _low_level_execute_command(): starting 24160 1726853555.23433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608 `" && echo ansible-tmp-1726853555.234161-25690-168352324416608="` echo /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608 `" ) && sleep 0' 24160 1726853555.23841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853555.23844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853555.23846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853555.23857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.23901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853555.23904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853555.23947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853555.25826: stdout chunk (state=3): >>>ansible-tmp-1726853555.234161-25690-168352324416608=/root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608 <<< 24160 1726853555.25933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853555.25953: stderr chunk (state=3): >>><<< 24160 1726853555.25959: stdout chunk (state=3): >>><<< 24160 1726853555.25972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853555.234161-25690-168352324416608=/root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853555.25999: variable 'ansible_module_compression' from source: unknown 24160 1726853555.26035: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24160 1726853555.26088: variable 'ansible_facts' from source: unknown 24160 1726853555.26220: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py 24160 1726853555.26317: Sending initial data 24160 1726853555.26320: Sent initial data (153 bytes) 24160 1726853555.26737: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853555.26740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.26743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853555.26745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853555.26747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.26797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853555.26800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853555.26845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853555.28383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 24160 1726853555.28387: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853555.28418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853555.28457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpl7u8gt47 /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py <<< 24160 1726853555.28461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py" <<< 24160 1726853555.28503: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpl7u8gt47" to remote "/root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py" <<< 24160 1726853555.29510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853555.29542: stderr chunk (state=3): >>><<< 24160 1726853555.29546: stdout chunk (state=3): >>><<< 24160 1726853555.29565: done transferring module to remote 24160 1726853555.29576: _low_level_execute_command(): starting 24160 1726853555.29579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/ /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py && sleep 0' 24160 1726853555.29984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853555.29988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.29990: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853555.29992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.30043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853555.30046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853555.30091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853555.31802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853555.31823: stderr chunk (state=3): >>><<< 24160 1726853555.31826: stdout chunk (state=3): >>><<< 24160 1726853555.31844: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853555.31847: _low_level_execute_command(): starting 24160 1726853555.31850: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/AnsiballZ_setup.py && sleep 0' 24160 1726853555.32258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853555.32261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853555.32263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853555.32277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853555.32280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.32315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 
24160 1726853555.32318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853555.32365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853555.96250: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, 
"micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rN<<< 24160 1726853555.96276: stdout chunk (state=3): >>>B/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.48779296875, "5m": 0.369140625, "15m": 0.19970703125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "35", "epoch": "1726853555", "epoch_int": "1726853555", "date": "2024-09-20", "time": "13:32:35", "iso8601_micro": "2024-09-20T17:32:35.597320Z", "iso8601": "2024-09-20T17:32:35Z", "iso8601_basic": "20240920T133235597320", "iso8601_basic_short": "20240920T133235", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 721, "ansible_lvm": {"lvs": {}, "vgs": {}, 
"pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794676736, "block_size": 4096, "block_total": 65519099, "block_available": 63914716, "block_used": 1604383, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", <<< 24160 1726853555.96309: stdout chunk (state=3): >>>"prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": 
"46:cb:a9:60:52:30", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44cb:a9ff:fe60:5230", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ba:78:39:74:8a:f6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b878:39ff:fe74:8af6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_r<<< 24160 1726853555.96325: stdout chunk (state=3): >>>eceive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24160 1726853555.98241: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853555.98278: stderr chunk (state=3): >>><<< 24160 1726853555.98282: stdout chunk (state=3): >>><<< 24160 1726853555.98323: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.48779296875, "5m": 0.369140625, "15m": 0.19970703125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "32", "second": "35", "epoch": "1726853555", "epoch_int": "1726853555", "date": "2024-09-20", "time": "13:32:35", "iso8601_micro": "2024-09-20T17:32:35.597320Z", "iso8601": "2024-09-20T17:32:35Z", "iso8601_basic": "20240920T133235597320", "iso8601_basic_short": "20240920T133235", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 721, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794676736, "block_size": 4096, "block_total": 65519099, "block_available": 63914716, "block_used": 1604383, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_interfaces": ["peerethtest0", "eth0", "ethtest0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "46:cb:a9:60:52:30", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, 
"ipv6": [{"address": "fe80::44cb:a9ff:fe60:5230", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": 
"off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "ba:78:39:74:8a:f6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b878:39ff:fe74:8af6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": 
"on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f", "fe80::44cb:a9ff:fe60:5230", "fe80::b878:39ff:fe74:8af6"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853555.98613: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853555.98631: _low_level_execute_command(): starting 24160 1726853555.98635: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853555.234161-25690-168352324416608/ > /dev/null 2>&1 && sleep 0' 24160 
1726853555.99096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853555.99099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853555.99102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24160 1726853555.99104: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853555.99106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853555.99159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853555.99162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853555.99166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853555.99207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.00990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.01015: stderr chunk (state=3): >>><<< 24160 1726853556.01018: stdout chunk (state=3): >>><<< 24160 1726853556.01032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.01043: handler run complete 24160 1726853556.01133: variable 'ansible_facts' from source: unknown 24160 1726853556.01214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.01425: variable 'ansible_facts' from source: unknown 24160 1726853556.01488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.01589: attempt loop complete, returning result 24160 1726853556.01593: _execute() done 24160 1726853556.01596: dumping result to json 24160 1726853556.01619: done dumping result, returning 24160 1726853556.01626: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-5676-4eb4-0000000005ee] 24160 1726853556.01628: sending task result for task 02083763-bbaf-5676-4eb4-0000000005ee 24160 1726853556.02067: done sending task 
result for task 02083763-bbaf-5676-4eb4-0000000005ee 24160 1726853556.02070: WORKER PROCESS EXITING ok: [managed_node1] 24160 1726853556.02328: no more pending results, returning what we have 24160 1726853556.02330: results queue empty 24160 1726853556.02330: checking for any_errors_fatal 24160 1726853556.02331: done checking for any_errors_fatal 24160 1726853556.02332: checking for max_fail_percentage 24160 1726853556.02333: done checking for max_fail_percentage 24160 1726853556.02333: checking to see if all hosts have failed and the running result is not ok 24160 1726853556.02334: done checking to see if all hosts have failed 24160 1726853556.02334: getting the remaining hosts for this loop 24160 1726853556.02335: done getting the remaining hosts for this loop 24160 1726853556.02337: getting the next task for host managed_node1 24160 1726853556.02341: done getting next task for host managed_node1 24160 1726853556.02342: ^ task is: TASK: meta (flush_handlers) 24160 1726853556.02344: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853556.02347: getting variables 24160 1726853556.02348: in VariableManager get_vars() 24160 1726853556.02364: Calling all_inventory to load vars for managed_node1 24160 1726853556.02366: Calling groups_inventory to load vars for managed_node1 24160 1726853556.02368: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.02377: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.02379: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.02381: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.03166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.04040: done with get_vars() 24160 1726853556.04055: done getting variables 24160 1726853556.04105: in VariableManager get_vars() 24160 1726853556.04111: Calling all_inventory to load vars for managed_node1 24160 1726853556.04113: Calling groups_inventory to load vars for managed_node1 24160 1726853556.04114: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.04118: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.04119: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.04120: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.04752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.05706: done with get_vars() 24160 1726853556.05723: done queuing things up, now waiting for results queue to drain 24160 1726853556.05725: results queue empty 24160 1726853556.05725: checking for any_errors_fatal 24160 1726853556.05728: done checking for any_errors_fatal 24160 1726853556.05733: checking for max_fail_percentage 24160 1726853556.05734: done checking for max_fail_percentage 24160 1726853556.05734: checking to see if all hosts have failed and the running result is not 
ok 24160 1726853556.05735: done checking to see if all hosts have failed 24160 1726853556.05735: getting the remaining hosts for this loop 24160 1726853556.05736: done getting the remaining hosts for this loop 24160 1726853556.05737: getting the next task for host managed_node1 24160 1726853556.05740: done getting next task for host managed_node1 24160 1726853556.05742: ^ task is: TASK: Include the task 'delete_interface.yml' 24160 1726853556.05743: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853556.05744: getting variables 24160 1726853556.05745: in VariableManager get_vars() 24160 1726853556.05751: Calling all_inventory to load vars for managed_node1 24160 1726853556.05752: Calling groups_inventory to load vars for managed_node1 24160 1726853556.05754: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.05759: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.05761: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.05764: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.06395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.07241: done with get_vars() 24160 1726853556.07254: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:83 Friday 20 September 2024 13:32:36 -0400 (0:00:00.881) 0:00:32.475 ****** 24160 1726853556.07311: entering _queue_task() for managed_node1/include_tasks 24160 1726853556.07567: worker is 1 (out of 1 available) 24160 
1726853556.07581: exiting _queue_task() for managed_node1/include_tasks 24160 1726853556.07594: done queuing things up, now waiting for results queue to drain 24160 1726853556.07595: waiting for pending results... 24160 1726853556.07780: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 24160 1726853556.07856: in run() - task 02083763-bbaf-5676-4eb4-00000000009c 24160 1726853556.07873: variable 'ansible_search_path' from source: unknown 24160 1726853556.07903: calling self._execute() 24160 1726853556.07976: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.07980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.07989: variable 'omit' from source: magic vars 24160 1726853556.08281: variable 'ansible_distribution_major_version' from source: facts 24160 1726853556.08290: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853556.08297: _execute() done 24160 1726853556.08300: dumping result to json 24160 1726853556.08304: done dumping result, returning 24160 1726853556.08310: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [02083763-bbaf-5676-4eb4-00000000009c] 24160 1726853556.08315: sending task result for task 02083763-bbaf-5676-4eb4-00000000009c 24160 1726853556.08400: done sending task result for task 02083763-bbaf-5676-4eb4-00000000009c 24160 1726853556.08403: WORKER PROCESS EXITING 24160 1726853556.08428: no more pending results, returning what we have 24160 1726853556.08432: in VariableManager get_vars() 24160 1726853556.08464: Calling all_inventory to load vars for managed_node1 24160 1726853556.08467: Calling groups_inventory to load vars for managed_node1 24160 1726853556.08470: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.08484: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.08487: Calling 
groups_plugins_inventory to load vars for managed_node1 24160 1726853556.08490: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.09330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.10197: done with get_vars() 24160 1726853556.10211: variable 'ansible_search_path' from source: unknown 24160 1726853556.10221: we have included files to process 24160 1726853556.10222: generating all_blocks data 24160 1726853556.10223: done generating all_blocks data 24160 1726853556.10223: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24160 1726853556.10224: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24160 1726853556.10225: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24160 1726853556.10375: done processing included file 24160 1726853556.10377: iterating over new_blocks loaded from include file 24160 1726853556.10378: in VariableManager get_vars() 24160 1726853556.10385: done with get_vars() 24160 1726853556.10386: filtering new block on tags 24160 1726853556.10395: done filtering new block on tags 24160 1726853556.10396: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 24160 1726853556.10400: extending task lists for all hosts with included blocks 24160 1726853556.10447: done extending task lists 24160 1726853556.10448: done processing included files 24160 1726853556.10448: results queue empty 24160 1726853556.10449: checking for any_errors_fatal 24160 1726853556.10449: done checking for any_errors_fatal 24160 1726853556.10450: checking for 
max_fail_percentage 24160 1726853556.10450: done checking for max_fail_percentage 24160 1726853556.10451: checking to see if all hosts have failed and the running result is not ok 24160 1726853556.10451: done checking to see if all hosts have failed 24160 1726853556.10452: getting the remaining hosts for this loop 24160 1726853556.10453: done getting the remaining hosts for this loop 24160 1726853556.10455: getting the next task for host managed_node1 24160 1726853556.10457: done getting next task for host managed_node1 24160 1726853556.10459: ^ task is: TASK: Remove test interface if necessary 24160 1726853556.10460: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853556.10462: getting variables 24160 1726853556.10462: in VariableManager get_vars() 24160 1726853556.10468: Calling all_inventory to load vars for managed_node1 24160 1726853556.10469: Calling groups_inventory to load vars for managed_node1 24160 1726853556.10472: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.10476: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.10477: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.10479: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.11109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.12135: done with get_vars() 24160 1726853556.12148: done getting variables 24160 1726853556.12179: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:32:36 -0400 (0:00:00.048) 0:00:32.524 ****** 24160 1726853556.12199: entering _queue_task() for managed_node1/command 24160 1726853556.12429: worker is 1 (out of 1 available) 24160 1726853556.12442: exiting _queue_task() for managed_node1/command 24160 1726853556.12456: done queuing things up, now waiting for results queue to drain 24160 1726853556.12458: waiting for pending results... 
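Each record in this trace is prefixed by the controller worker's PID (24160 throughout) and a Unix epoch timestamp, separated from the message by `: `. The human-readable times in the TASK banners (e.g. "Friday 20 September 2024 13:32:36 -0400") can be recovered from those epochs. A minimal decoding sketch — the prefix format is inferred from this log itself, not from any documented ansible-core contract:

```python
from datetime import datetime, timezone

def decode_record(line: str):
    """Split a '<pid> <epoch>: <message>' trace record (format inferred from this log)."""
    head, _, message = line.partition(": ")
    pid, epoch = head.split()
    return int(pid), float(epoch), message

pid, epoch, msg = decode_record("24160 1726853556.12624: waiting for pending results...")
when = datetime.fromtimestamp(epoch, tz=timezone.utc)
# epoch 1726853556 corresponds to 2024-09-20 17:32:36 UTC,
# i.e. 13:32:36 -0400 as printed in the TASK banner above
```

The sub-second epoch deltas between consecutive records are what the `(0:00:00.048)` per-task durations in the banners are derived from.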
24160 1726853556.12624: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 24160 1726853556.12695: in run() - task 02083763-bbaf-5676-4eb4-0000000005ff 24160 1726853556.12706: variable 'ansible_search_path' from source: unknown 24160 1726853556.12710: variable 'ansible_search_path' from source: unknown 24160 1726853556.12737: calling self._execute() 24160 1726853556.12813: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.12817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.12826: variable 'omit' from source: magic vars 24160 1726853556.13102: variable 'ansible_distribution_major_version' from source: facts 24160 1726853556.13113: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853556.13118: variable 'omit' from source: magic vars 24160 1726853556.13145: variable 'omit' from source: magic vars 24160 1726853556.13212: variable 'interface' from source: set_fact 24160 1726853556.13226: variable 'omit' from source: magic vars 24160 1726853556.13261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853556.13287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853556.13316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853556.13356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853556.13576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853556.13580: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853556.13582: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.13585: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.13587: Set connection var ansible_shell_executable to /bin/sh 24160 1726853556.13589: Set connection var ansible_pipelining to False 24160 1726853556.13591: Set connection var ansible_connection to ssh 24160 1726853556.13593: Set connection var ansible_shell_type to sh 24160 1726853556.13595: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853556.13597: Set connection var ansible_timeout to 10 24160 1726853556.13599: variable 'ansible_shell_executable' from source: unknown 24160 1726853556.13610: variable 'ansible_connection' from source: unknown 24160 1726853556.13618: variable 'ansible_module_compression' from source: unknown 24160 1726853556.13625: variable 'ansible_shell_type' from source: unknown 24160 1726853556.13631: variable 'ansible_shell_executable' from source: unknown 24160 1726853556.13637: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.13644: variable 'ansible_pipelining' from source: unknown 24160 1726853556.13650: variable 'ansible_timeout' from source: unknown 24160 1726853556.13657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.13806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853556.13833: variable 'omit' from source: magic vars 24160 1726853556.13843: starting attempt loop 24160 1726853556.13849: running the handler 24160 1726853556.13868: _low_level_execute_command(): starting 24160 1726853556.13881: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853556.14621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 
1726853556.14635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853556.14651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853556.14710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.14779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.14804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.14838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.14904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.16568: stdout chunk (state=3): >>>/root <<< 24160 1726853556.16775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.16779: stdout chunk (state=3): >>><<< 24160 1726853556.16781: stderr chunk (state=3): >>><<< 24160 1726853556.16785: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.16787: _low_level_execute_command(): starting 24160 1726853556.16789: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213 `" && echo ansible-tmp-1726853556.167116-25715-191058346321213="` echo /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213 `" ) && sleep 0' 24160 1726853556.17136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.17177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853556.17180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853556.17190: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.17193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 24160 1726853556.17195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853556.17204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.17244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.17249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.17293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.19185: stdout chunk (state=3): >>>ansible-tmp-1726853556.167116-25715-191058346321213=/root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213 <<< 24160 1726853556.19350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.19353: stdout chunk (state=3): >>><<< 24160 1726853556.19355: stderr chunk (state=3): >>><<< 24160 1726853556.19378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853556.167116-25715-191058346321213=/root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.19418: variable 'ansible_module_compression' from source: unknown 24160 1726853556.19561: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853556.19565: variable 'ansible_facts' from source: unknown 24160 1726853556.19625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py 24160 1726853556.19800: Sending initial data 24160 1726853556.19813: Sent initial data (155 bytes) 24160 1726853556.20411: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.20425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853556.20479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853556.20555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.20602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.20666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.22246: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853556.22267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853556.22315: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmppthkqduv /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py <<< 24160 1726853556.22318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py" <<< 24160 1726853556.22368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmppthkqduv" to remote "/root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py" <<< 24160 1726853556.23243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.23247: stdout chunk (state=3): >>><<< 24160 1726853556.23249: stderr chunk (state=3): >>><<< 24160 1726853556.23258: done transferring module to remote 24160 1726853556.23276: _low_level_execute_command(): starting 24160 1726853556.23287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/ /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py && sleep 0' 24160 1726853556.23943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.23956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853556.23970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853556.23990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853556.24015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 
1726853556.24121: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.24143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.24209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.25928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.25982: stderr chunk (state=3): >>><<< 24160 1726853556.25998: stdout chunk (state=3): >>><<< 24160 1726853556.26020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.26030: _low_level_execute_command(): starting 24160 1726853556.26039: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/AnsiballZ_command.py && sleep 0' 24160 1726853556.26623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.26637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853556.26650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853556.26670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853556.26690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853556.26699: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853556.26788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.26805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.26970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.43279: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 13:32:36.420694", "end": "2024-09-20 13:32:36.431251", "delta": "0:00:00.010557", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853556.45621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853556.45640: stdout chunk (state=3): >>><<< 24160 1726853556.45664: stderr chunk (state=3): >>><<< 24160 1726853556.45696: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 13:32:36.420694", "end": "2024-09-20 13:32:36.431251", "delta": "0:00:00.010557", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853556.45841: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853556.45845: _low_level_execute_command(): starting 24160 1726853556.45847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853556.167116-25715-191058346321213/ > /dev/null 2>&1 && sleep 0' 24160 1726853556.46424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.46438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853556.46453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853556.46481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853556.46502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853556.46585: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.46604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.46622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.46794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.46892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.48853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.48861: stdout chunk (state=3): >>><<< 24160 1726853556.48863: stderr chunk (state=3): >>><<< 24160 1726853556.49090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.49094: handler run complete 24160 1726853556.49096: Evaluated conditional (False): False 24160 1726853556.49098: attempt loop complete, returning result 24160 1726853556.49100: _execute() done 24160 1726853556.49102: dumping result to json 24160 1726853556.49104: done dumping result, returning 24160 1726853556.49106: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [02083763-bbaf-5676-4eb4-0000000005ff] 24160 1726853556.49108: sending task result for task 02083763-bbaf-5676-4eb4-0000000005ff ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.010557", "end": "2024-09-20 13:32:36.431251", "rc": 0, "start": "2024-09-20 13:32:36.420694" } 24160 1726853556.49376: no more pending results, returning what we have 24160 1726853556.49380: results queue empty 24160 1726853556.49381: checking for any_errors_fatal 24160 1726853556.49383: done checking for any_errors_fatal 24160 1726853556.49383: checking for max_fail_percentage 24160 1726853556.49385: done checking for max_fail_percentage 24160 1726853556.49386: checking to see if all hosts have failed and the running result is not ok 24160 1726853556.49386: done checking to see if all hosts have failed 24160 1726853556.49387: getting the remaining hosts for this loop 24160 1726853556.49388: done getting the remaining hosts for this loop 24160 1726853556.49391: getting the next task for host managed_node1 24160 1726853556.49398: done getting next task for host managed_node1 24160 1726853556.49401: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 24160 1726853556.49403: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853556.49408: getting variables 24160 1726853556.49414: in VariableManager get_vars() 24160 1726853556.49442: Calling all_inventory to load vars for managed_node1 24160 1726853556.49445: Calling groups_inventory to load vars for managed_node1 24160 1726853556.49448: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.49463: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.49465: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.49468: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.50479: done sending task result for task 02083763-bbaf-5676-4eb4-0000000005ff 24160 1726853556.50483: WORKER PROCESS EXITING 24160 1726853556.53117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.56129: done with get_vars() 24160 1726853556.56157: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:85 Friday 20 September 2024 13:32:36 -0400 (0:00:00.440) 0:00:32.965 ****** 24160 1726853556.56263: entering _queue_task() for managed_node1/include_tasks 24160 1726853556.56761: worker is 1 (out of 1 available) 24160 1726853556.56795: exiting _queue_task() for managed_node1/include_tasks 24160 1726853556.56808: done queuing things up, now waiting for results queue to drain 24160 1726853556.56810: waiting for pending results... 
24160 1726853556.57037: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 24160 1726853556.57143: in run() - task 02083763-bbaf-5676-4eb4-00000000009d 24160 1726853556.57165: variable 'ansible_search_path' from source: unknown 24160 1726853556.57208: calling self._execute() 24160 1726853556.57306: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.57318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.57332: variable 'omit' from source: magic vars 24160 1726853556.57700: variable 'ansible_distribution_major_version' from source: facts 24160 1726853556.57718: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853556.57728: _execute() done 24160 1726853556.57736: dumping result to json 24160 1726853556.57743: done dumping result, returning 24160 1726853556.57977: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [02083763-bbaf-5676-4eb4-00000000009d] 24160 1726853556.57981: sending task result for task 02083763-bbaf-5676-4eb4-00000000009d 24160 1726853556.58043: done sending task result for task 02083763-bbaf-5676-4eb4-00000000009d 24160 1726853556.58046: WORKER PROCESS EXITING 24160 1726853556.58073: no more pending results, returning what we have 24160 1726853556.58077: in VariableManager get_vars() 24160 1726853556.58105: Calling all_inventory to load vars for managed_node1 24160 1726853556.58107: Calling groups_inventory to load vars for managed_node1 24160 1726853556.58110: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.58119: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.58121: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.58150: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.59415: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.61080: done with get_vars() 24160 1726853556.61099: variable 'ansible_search_path' from source: unknown 24160 1726853556.61115: we have included files to process 24160 1726853556.61116: generating all_blocks data 24160 1726853556.61118: done generating all_blocks data 24160 1726853556.61123: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24160 1726853556.61124: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24160 1726853556.61127: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24160 1726853556.61284: in VariableManager get_vars() 24160 1726853556.61303: done with get_vars() 24160 1726853556.61413: done processing included file 24160 1726853556.61415: iterating over new_blocks loaded from include file 24160 1726853556.61417: in VariableManager get_vars() 24160 1726853556.61428: done with get_vars() 24160 1726853556.61430: filtering new block on tags 24160 1726853556.61447: done filtering new block on tags 24160 1726853556.61449: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 24160 1726853556.61454: extending task lists for all hosts with included blocks 24160 1726853556.61594: done extending task lists 24160 1726853556.61596: done processing included files 24160 1726853556.61597: results queue empty 24160 1726853556.61597: checking for any_errors_fatal 24160 1726853556.61602: done checking for any_errors_fatal 24160 1726853556.61603: checking for max_fail_percentage 24160 1726853556.61604: done 
checking for max_fail_percentage 24160 1726853556.61605: checking to see if all hosts have failed and the running result is not ok 24160 1726853556.61606: done checking to see if all hosts have failed 24160 1726853556.61607: getting the remaining hosts for this loop 24160 1726853556.61608: done getting the remaining hosts for this loop 24160 1726853556.61610: getting the next task for host managed_node1 24160 1726853556.61614: done getting next task for host managed_node1 24160 1726853556.61616: ^ task is: TASK: Include the task 'get_profile_stat.yml' 24160 1726853556.61619: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853556.61621: getting variables 24160 1726853556.61622: in VariableManager get_vars() 24160 1726853556.61631: Calling all_inventory to load vars for managed_node1 24160 1726853556.61633: Calling groups_inventory to load vars for managed_node1 24160 1726853556.61635: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.61640: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.61642: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.61645: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.62766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.64276: done with get_vars() 24160 1726853556.64299: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:32:36 -0400 (0:00:00.081) 0:00:33.046 ****** 24160 1726853556.64377: entering _queue_task() for managed_node1/include_tasks 24160 1726853556.64724: worker is 1 (out of 1 available) 24160 1726853556.64735: exiting _queue_task() for managed_node1/include_tasks 24160 1726853556.64748: done queuing things up, now waiting for results queue to drain 24160 1726853556.64750: waiting for pending results... 
24160 1726853556.65035: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 24160 1726853556.65147: in run() - task 02083763-bbaf-5676-4eb4-000000000612 24160 1726853556.65169: variable 'ansible_search_path' from source: unknown 24160 1726853556.65181: variable 'ansible_search_path' from source: unknown 24160 1726853556.65226: calling self._execute() 24160 1726853556.65332: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.65345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.65359: variable 'omit' from source: magic vars 24160 1726853556.65757: variable 'ansible_distribution_major_version' from source: facts 24160 1726853556.65778: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853556.65790: _execute() done 24160 1726853556.65799: dumping result to json 24160 1726853556.65807: done dumping result, returning 24160 1726853556.65817: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-5676-4eb4-000000000612] 24160 1726853556.65828: sending task result for task 02083763-bbaf-5676-4eb4-000000000612 24160 1726853556.65979: no more pending results, returning what we have 24160 1726853556.65985: in VariableManager get_vars() 24160 1726853556.66020: Calling all_inventory to load vars for managed_node1 24160 1726853556.66023: Calling groups_inventory to load vars for managed_node1 24160 1726853556.66027: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.66040: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.66044: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.66048: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.66784: done sending task result for task 02083763-bbaf-5676-4eb4-000000000612 24160 1726853556.66787: WORKER PROCESS EXITING 24160 
1726853556.67674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.69247: done with get_vars() 24160 1726853556.69266: variable 'ansible_search_path' from source: unknown 24160 1726853556.69267: variable 'ansible_search_path' from source: unknown 24160 1726853556.69305: we have included files to process 24160 1726853556.69307: generating all_blocks data 24160 1726853556.69308: done generating all_blocks data 24160 1726853556.69309: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24160 1726853556.69310: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24160 1726853556.69312: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24160 1726853556.70293: done processing included file 24160 1726853556.70295: iterating over new_blocks loaded from include file 24160 1726853556.70296: in VariableManager get_vars() 24160 1726853556.70309: done with get_vars() 24160 1726853556.70310: filtering new block on tags 24160 1726853556.70332: done filtering new block on tags 24160 1726853556.70334: in VariableManager get_vars() 24160 1726853556.70346: done with get_vars() 24160 1726853556.70347: filtering new block on tags 24160 1726853556.70367: done filtering new block on tags 24160 1726853556.70369: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 24160 1726853556.70376: extending task lists for all hosts with included blocks 24160 1726853556.70468: done extending task lists 24160 1726853556.70470: done processing included files 24160 1726853556.70472: results queue empty 24160 
1726853556.70473: checking for any_errors_fatal 24160 1726853556.70475: done checking for any_errors_fatal 24160 1726853556.70476: checking for max_fail_percentage 24160 1726853556.70477: done checking for max_fail_percentage 24160 1726853556.70478: checking to see if all hosts have failed and the running result is not ok 24160 1726853556.70479: done checking to see if all hosts have failed 24160 1726853556.70479: getting the remaining hosts for this loop 24160 1726853556.70480: done getting the remaining hosts for this loop 24160 1726853556.70483: getting the next task for host managed_node1 24160 1726853556.70487: done getting next task for host managed_node1 24160 1726853556.70489: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 24160 1726853556.70492: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853556.70494: getting variables 24160 1726853556.70495: in VariableManager get_vars() 24160 1726853556.70551: Calling all_inventory to load vars for managed_node1 24160 1726853556.70554: Calling groups_inventory to load vars for managed_node1 24160 1726853556.70557: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.70562: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.70565: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.70568: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.71624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.73122: done with get_vars() 24160 1726853556.73143: done getting variables 24160 1726853556.73185: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:32:36 -0400 (0:00:00.088) 0:00:33.134 ****** 24160 1726853556.73217: entering _queue_task() for managed_node1/set_fact 24160 1726853556.73557: worker is 1 (out of 1 available) 24160 1726853556.73569: exiting _queue_task() for managed_node1/set_fact 24160 1726853556.73584: done queuing things up, now waiting for results queue to drain 24160 1726853556.73585: waiting for pending results... 
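(Editorial note: a minimal, hypothetical reconstruction of the task at get_profile_stat.yml:3 that this log entry is queuing. The real source lives in the fedora.linux_system_roles collection and is not shown in the log; the fact names below are taken from the `ansible_facts` the log records in this task's result, everything else is an assumption.)

```
# Hypothetical sketch -- not the actual collection source.
# Fact names reconstructed from the logged set_fact result;
# the task name matches the TASK header above.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

These three flags default to false here and are presumably flipped by later tasks in the included file once the profile is found and inspected.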
24160 1726853556.73848: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 24160 1726853556.73966: in run() - task 02083763-bbaf-5676-4eb4-00000000062a 24160 1726853556.73991: variable 'ansible_search_path' from source: unknown 24160 1726853556.74000: variable 'ansible_search_path' from source: unknown 24160 1726853556.74035: calling self._execute() 24160 1726853556.74130: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.74141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.74155: variable 'omit' from source: magic vars 24160 1726853556.74543: variable 'ansible_distribution_major_version' from source: facts 24160 1726853556.74562: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853556.74576: variable 'omit' from source: magic vars 24160 1726853556.74633: variable 'omit' from source: magic vars 24160 1726853556.74676: variable 'omit' from source: magic vars 24160 1726853556.74722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853556.74761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853556.74789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853556.74813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853556.74836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853556.74873: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853556.74883: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.74895: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 24160 1726853556.75002: Set connection var ansible_shell_executable to /bin/sh 24160 1726853556.75015: Set connection var ansible_pipelining to False 24160 1726853556.75023: Set connection var ansible_connection to ssh 24160 1726853556.75029: Set connection var ansible_shell_type to sh 24160 1726853556.75048: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853556.75063: Set connection var ansible_timeout to 10 24160 1726853556.75091: variable 'ansible_shell_executable' from source: unknown 24160 1726853556.75102: variable 'ansible_connection' from source: unknown 24160 1726853556.75111: variable 'ansible_module_compression' from source: unknown 24160 1726853556.75118: variable 'ansible_shell_type' from source: unknown 24160 1726853556.75125: variable 'ansible_shell_executable' from source: unknown 24160 1726853556.75132: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.75140: variable 'ansible_pipelining' from source: unknown 24160 1726853556.75147: variable 'ansible_timeout' from source: unknown 24160 1726853556.75160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.75305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853556.75373: variable 'omit' from source: magic vars 24160 1726853556.75377: starting attempt loop 24160 1726853556.75379: running the handler 24160 1726853556.75382: handler run complete 24160 1726853556.75384: attempt loop complete, returning result 24160 1726853556.75386: _execute() done 24160 1726853556.75389: dumping result to json 24160 1726853556.75391: done dumping result, returning 24160 1726853556.75393: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-5676-4eb4-00000000062a] 24160 1726853556.75395: sending task result for task 02083763-bbaf-5676-4eb4-00000000062a ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 24160 1726853556.75528: no more pending results, returning what we have 24160 1726853556.75532: results queue empty 24160 1726853556.75533: checking for any_errors_fatal 24160 1726853556.75535: done checking for any_errors_fatal 24160 1726853556.75535: checking for max_fail_percentage 24160 1726853556.75537: done checking for max_fail_percentage 24160 1726853556.75538: checking to see if all hosts have failed and the running result is not ok 24160 1726853556.75539: done checking to see if all hosts have failed 24160 1726853556.75539: getting the remaining hosts for this loop 24160 1726853556.75541: done getting the remaining hosts for this loop 24160 1726853556.75544: getting the next task for host managed_node1 24160 1726853556.75551: done getting next task for host managed_node1 24160 1726853556.75553: ^ task is: TASK: Stat profile file 24160 1726853556.75557: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853556.75562: getting variables 24160 1726853556.75564: in VariableManager get_vars() 24160 1726853556.75594: Calling all_inventory to load vars for managed_node1 24160 1726853556.75598: Calling groups_inventory to load vars for managed_node1 24160 1726853556.75601: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853556.75614: Calling all_plugins_play to load vars for managed_node1 24160 1726853556.75617: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853556.75620: Calling groups_plugins_play to load vars for managed_node1 24160 1726853556.76475: done sending task result for task 02083763-bbaf-5676-4eb4-00000000062a 24160 1726853556.76479: WORKER PROCESS EXITING 24160 1726853556.81276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853556.82785: done with get_vars() 24160 1726853556.82808: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:32:36 -0400 (0:00:00.096) 0:00:33.231 ****** 24160 1726853556.82881: entering _queue_task() for managed_node1/stat 24160 1726853556.83218: worker is 1 (out of 1 available) 24160 1726853556.83230: exiting _queue_task() for managed_node1/stat 24160 1726853556.83243: done queuing things up, now waiting for results queue to drain 24160 1726853556.83245: waiting for pending results... 
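(Editorial note: a minimal, hypothetical reconstruction of the task at get_profile_stat.yml:9 being queued here. The module arguments are taken from the `invocation` block the log records for this task further on; the templated path and the `register` name are assumptions for illustration, not the collection's actual source.)

```
# Hypothetical sketch -- module args copied from the logged invocation
# (path resolved there to /etc/sysconfig/network-scripts/ifcfg-ethtest0).
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # assumed templating
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: stat_profile_file  # assumed register name
```

The `stat` result logged below (`"exists": false`) is what keeps the `lsr_net_profile_exists` flag initialized by the previous task at false.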
24160 1726853556.83524: running TaskExecutor() for managed_node1/TASK: Stat profile file 24160 1726853556.83659: in run() - task 02083763-bbaf-5676-4eb4-00000000062b 24160 1726853556.83685: variable 'ansible_search_path' from source: unknown 24160 1726853556.83698: variable 'ansible_search_path' from source: unknown 24160 1726853556.83876: calling self._execute() 24160 1726853556.83879: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.83882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.83885: variable 'omit' from source: magic vars 24160 1726853556.84263: variable 'ansible_distribution_major_version' from source: facts 24160 1726853556.84284: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853556.84297: variable 'omit' from source: magic vars 24160 1726853556.84361: variable 'omit' from source: magic vars 24160 1726853556.84462: variable 'profile' from source: include params 24160 1726853556.84473: variable 'interface' from source: set_fact 24160 1726853556.84548: variable 'interface' from source: set_fact 24160 1726853556.84575: variable 'omit' from source: magic vars 24160 1726853556.84618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853556.84661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853556.84687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853556.84709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853556.84727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853556.84766: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 
1726853556.84866: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.84869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.84893: Set connection var ansible_shell_executable to /bin/sh 24160 1726853556.84906: Set connection var ansible_pipelining to False 24160 1726853556.84914: Set connection var ansible_connection to ssh 24160 1726853556.84922: Set connection var ansible_shell_type to sh 24160 1726853556.84934: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853556.84949: Set connection var ansible_timeout to 10 24160 1726853556.84981: variable 'ansible_shell_executable' from source: unknown 24160 1726853556.84990: variable 'ansible_connection' from source: unknown 24160 1726853556.84997: variable 'ansible_module_compression' from source: unknown 24160 1726853556.85004: variable 'ansible_shell_type' from source: unknown 24160 1726853556.85012: variable 'ansible_shell_executable' from source: unknown 24160 1726853556.85020: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853556.85027: variable 'ansible_pipelining' from source: unknown 24160 1726853556.85036: variable 'ansible_timeout' from source: unknown 24160 1726853556.85044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853556.85251: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853556.85268: variable 'omit' from source: magic vars 24160 1726853556.85282: starting attempt loop 24160 1726853556.85299: running the handler 24160 1726853556.85376: _low_level_execute_command(): starting 24160 1726853556.85379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853556.86042: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.86074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853556.86174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.86186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.86206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.86293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.87974: stdout chunk (state=3): >>>/root <<< 24160 1726853556.88068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.88125: stderr chunk (state=3): >>><<< 24160 1726853556.88136: stdout chunk (state=3): >>><<< 24160 1726853556.88167: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.88188: _low_level_execute_command(): starting 24160 1726853556.88200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694 `" && echo ansible-tmp-1726853556.8817546-25741-275435671421694="` echo /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694 `" ) && sleep 0' 24160 1726853556.88789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853556.88805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853556.88827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853556.88848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853556.88877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853556.88890: stderr 
chunk (state=3): >>>debug2: match not found <<< 24160 1726853556.88904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.88932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853556.88990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.89051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.89067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.89153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.91017: stdout chunk (state=3): >>>ansible-tmp-1726853556.8817546-25741-275435671421694=/root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694 <<< 24160 1726853556.91159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.91185: stdout chunk (state=3): >>><<< 24160 1726853556.91188: stderr chunk (state=3): >>><<< 24160 1726853556.91376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853556.8817546-25741-275435671421694=/root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.91379: variable 'ansible_module_compression' from source: unknown 24160 1726853556.91382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24160 1726853556.91384: variable 'ansible_facts' from source: unknown 24160 1726853556.91446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py 24160 1726853556.91632: Sending initial data 24160 1726853556.91636: Sent initial data (153 bytes) 24160 1726853556.92276: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.92427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.92444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.92466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.92536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.94119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853556.94208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853556.94438: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmp7gwfzfd1 /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py <<< 24160 1726853556.94474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py" <<< 24160 1726853556.94478: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmp7gwfzfd1" to remote "/root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py" <<< 24160 1726853556.95654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.95825: stderr chunk (state=3): >>><<< 24160 1726853556.95829: stdout chunk (state=3): >>><<< 24160 1726853556.95832: done transferring module to remote 24160 1726853556.95835: _low_level_execute_command(): starting 24160 1726853556.95837: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/ /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py && sleep 0' 24160 1726853556.96886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853556.96937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853556.96953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853556.96991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853556.97063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853556.99086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853556.99090: stdout chunk (state=3): >>><<< 24160 1726853556.99092: stderr chunk (state=3): >>><<< 24160 1726853556.99096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853556.99098: _low_level_execute_command(): starting 24160 1726853556.99101: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/AnsiballZ_stat.py && sleep 0' 24160 1726853557.00318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853557.00322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.00324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853557.00326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853557.00328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853557.00331: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853557.00344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853557.00387: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 24160 1726853557.00403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.00476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.15716: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24160 1726853557.17069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853557.17083: stdout chunk (state=3): >>><<< 24160 1726853557.17096: stderr chunk (state=3): >>><<< 24160 1726853557.17180: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853557.17215: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853557.17235: _low_level_execute_command(): starting 24160 1726853557.17550: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853556.8817546-25741-275435671421694/ > /dev/null 2>&1 && sleep 0' 24160 1726853557.18611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853557.18689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.18700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853557.18714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853557.18765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 <<< 24160 1726853557.18778: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853557.18785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.18801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853557.18808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 24160 1726853557.18815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24160 1726853557.18825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.19018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853557.19026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.19097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.20922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853557.20966: stderr chunk (state=3): >>><<< 24160 1726853557.21033: stdout chunk (state=3): >>><<< 24160 1726853557.21234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853557.21237: handler run complete 24160 1726853557.21240: attempt loop complete, returning result 24160 1726853557.21243: _execute() done 24160 1726853557.21245: dumping result to json 24160 1726853557.21247: done dumping result, returning 24160 1726853557.21249: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-5676-4eb4-00000000062b] 24160 1726853557.21251: sending task result for task 02083763-bbaf-5676-4eb4-00000000062b 24160 1726853557.21443: done sending task result for task 02083763-bbaf-5676-4eb4-00000000062b 24160 1726853557.21447: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 24160 1726853557.21534: no more pending results, returning what we have 24160 1726853557.21538: results queue empty 24160 1726853557.21539: checking for any_errors_fatal 24160 1726853557.21547: done checking for any_errors_fatal 24160 1726853557.21548: checking for max_fail_percentage 24160 1726853557.21550: done checking for max_fail_percentage 24160 1726853557.21551: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.21552: done 
checking to see if all hosts have failed 24160 1726853557.21552: getting the remaining hosts for this loop 24160 1726853557.21556: done getting the remaining hosts for this loop 24160 1726853557.21561: getting the next task for host managed_node1 24160 1726853557.21568: done getting next task for host managed_node1 24160 1726853557.21573: ^ task is: TASK: Set NM profile exist flag based on the profile files 24160 1726853557.21577: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853557.21583: getting variables 24160 1726853557.21584: in VariableManager get_vars() 24160 1726853557.21617: Calling all_inventory to load vars for managed_node1 24160 1726853557.21620: Calling groups_inventory to load vars for managed_node1 24160 1726853557.21624: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.21636: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.21639: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.21642: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.24802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.28266: done with get_vars() 24160 1726853557.28443: done getting variables 24160 1726853557.28515: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:32:37 -0400 (0:00:00.457) 0:00:33.688 ****** 24160 1726853557.28607: entering _queue_task() for managed_node1/set_fact 24160 1726853557.29394: worker is 1 (out of 1 available) 24160 1726853557.29406: exiting _queue_task() for managed_node1/set_fact 24160 1726853557.29419: done queuing things up, now waiting for results queue to drain 24160 1726853557.29421: waiting for pending results... 
24160 1726853557.29769: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 24160 1726853557.30232: in run() - task 02083763-bbaf-5676-4eb4-00000000062c 24160 1726853557.30236: variable 'ansible_search_path' from source: unknown 24160 1726853557.30238: variable 'ansible_search_path' from source: unknown 24160 1726853557.30241: calling self._execute() 24160 1726853557.30556: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.30560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.30563: variable 'omit' from source: magic vars 24160 1726853557.31261: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.31334: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.31679: variable 'profile_stat' from source: set_fact 24160 1726853557.31694: Evaluated conditional (profile_stat.stat.exists): False 24160 1726853557.31698: when evaluation is False, skipping this task 24160 1726853557.31701: _execute() done 24160 1726853557.31704: dumping result to json 24160 1726853557.31707: done dumping result, returning 24160 1726853557.31710: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-5676-4eb4-00000000062c] 24160 1726853557.31717: sending task result for task 02083763-bbaf-5676-4eb4-00000000062c 24160 1726853557.31816: done sending task result for task 02083763-bbaf-5676-4eb4-00000000062c 24160 1726853557.31820: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24160 1726853557.31869: no more pending results, returning what we have 24160 1726853557.31875: results queue empty 24160 1726853557.31877: checking for any_errors_fatal 24160 1726853557.31886: done checking for any_errors_fatal 24160 1726853557.31886: 
checking for max_fail_percentage 24160 1726853557.31888: done checking for max_fail_percentage 24160 1726853557.31889: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.31889: done checking to see if all hosts have failed 24160 1726853557.31890: getting the remaining hosts for this loop 24160 1726853557.31891: done getting the remaining hosts for this loop 24160 1726853557.31895: getting the next task for host managed_node1 24160 1726853557.31979: done getting next task for host managed_node1 24160 1726853557.31983: ^ task is: TASK: Get NM profile info 24160 1726853557.31987: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853557.31993: getting variables 24160 1726853557.31995: in VariableManager get_vars() 24160 1726853557.32030: Calling all_inventory to load vars for managed_node1 24160 1726853557.32033: Calling groups_inventory to load vars for managed_node1 24160 1726853557.32037: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.32049: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.32052: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.32057: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.35136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.38443: done with get_vars() 24160 1726853557.38470: done getting variables 24160 1726853557.38628: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:32:37 -0400 (0:00:00.100) 0:00:33.789 ****** 24160 1726853557.38661: entering _queue_task() for managed_node1/shell 24160 1726853557.39445: worker is 1 (out of 1 available) 24160 1726853557.39576: exiting _queue_task() for managed_node1/shell 24160 1726853557.39589: done queuing things up, now waiting for results queue to drain 24160 1726853557.39590: waiting for pending results... 
24160 1726853557.40024: running TaskExecutor() for managed_node1/TASK: Get NM profile info 24160 1726853557.40305: in run() - task 02083763-bbaf-5676-4eb4-00000000062d 24160 1726853557.40322: variable 'ansible_search_path' from source: unknown 24160 1726853557.40326: variable 'ansible_search_path' from source: unknown 24160 1726853557.40364: calling self._execute() 24160 1726853557.40535: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.40548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.40568: variable 'omit' from source: magic vars 24160 1726853557.41038: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.41042: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.41045: variable 'omit' from source: magic vars 24160 1726853557.41089: variable 'omit' from source: magic vars 24160 1726853557.41200: variable 'profile' from source: include params 24160 1726853557.41210: variable 'interface' from source: set_fact 24160 1726853557.41365: variable 'interface' from source: set_fact 24160 1726853557.41368: variable 'omit' from source: magic vars 24160 1726853557.41373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853557.41394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853557.41419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853557.41443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853557.41459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853557.41498: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 
1726853557.41508: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.41516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.41620: Set connection var ansible_shell_executable to /bin/sh 24160 1726853557.41631: Set connection var ansible_pipelining to False 24160 1726853557.41638: Set connection var ansible_connection to ssh 24160 1726853557.41645: Set connection var ansible_shell_type to sh 24160 1726853557.41657: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853557.41673: Set connection var ansible_timeout to 10 24160 1726853557.41777: variable 'ansible_shell_executable' from source: unknown 24160 1726853557.41781: variable 'ansible_connection' from source: unknown 24160 1726853557.41783: variable 'ansible_module_compression' from source: unknown 24160 1726853557.41785: variable 'ansible_shell_type' from source: unknown 24160 1726853557.41787: variable 'ansible_shell_executable' from source: unknown 24160 1726853557.41789: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.41791: variable 'ansible_pipelining' from source: unknown 24160 1726853557.41793: variable 'ansible_timeout' from source: unknown 24160 1726853557.41796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.41863: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853557.41880: variable 'omit' from source: magic vars 24160 1726853557.41890: starting attempt loop 24160 1726853557.41902: running the handler 24160 1726853557.41918: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853557.41941: _low_level_execute_command(): starting 24160 1726853557.41955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853557.43203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853557.43207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853557.43390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.43449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.45130: stdout chunk (state=3): >>>/root <<< 24160 1726853557.45228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853557.45267: stderr chunk (state=3): >>><<< 24160 1726853557.45281: stdout chunk (state=3): >>><<< 24160 1726853557.45309: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853557.45330: _low_level_execute_command(): starting 24160 1726853557.45342: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049 `" && echo ansible-tmp-1726853557.4531648-25760-104913346055049="` echo /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049 `" ) && sleep 0' 24160 1726853557.45985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853557.46000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.46030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 
1726853557.46266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853557.46366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.46526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.48421: stdout chunk (state=3): >>>ansible-tmp-1726853557.4531648-25760-104913346055049=/root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049 <<< 24160 1726853557.48543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853557.48555: stdout chunk (state=3): >>><<< 24160 1726853557.48575: stderr chunk (state=3): >>><<< 24160 1726853557.48600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853557.4531648-25760-104913346055049=/root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853557.48636: variable 'ansible_module_compression' from source: unknown 24160 1726853557.48700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853557.48745: variable 'ansible_facts' from source: unknown 24160 1726853557.48836: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py 24160 1726853557.48998: Sending initial data 24160 1726853557.49001: Sent initial data (156 bytes) 24160 1726853557.49976: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853557.49992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.50008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853557.50066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853557.50279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.50297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853557.50314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853557.50333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.50404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.51947: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 24160 1726853557.51961: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 24160 1726853557.51976: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 24160 1726853557.51994: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853557.52057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 24160 1726853557.52119: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpyfz5u2yi /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py <<< 24160 1726853557.52149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpyfz5u2yi" to remote "/root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py" <<< 24160 1726853557.52787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853557.52819: stderr chunk (state=3): >>><<< 24160 1726853557.52822: stdout chunk (state=3): >>><<< 24160 1726853557.52837: done transferring module to remote 24160 1726853557.52845: _low_level_execute_command(): starting 24160 1726853557.52850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/ /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py && sleep 0' 24160 1726853557.53256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.53260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853557.53266: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.53268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 24160 1726853557.53272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.53311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853557.53320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.53360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.55137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853557.55210: stderr chunk (state=3): >>><<< 24160 1726853557.55214: stdout chunk (state=3): >>><<< 24160 1726853557.55216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853557.55228: _low_level_execute_command(): starting 24160 1726853557.55231: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/AnsiballZ_command.py && sleep 0' 24160 1726853557.55789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853557.55801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.55816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853557.55834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853557.55851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853557.55862: stderr chunk (state=3): >>>debug2: match not found <<< 24160 1726853557.55879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.55898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24160 1726853557.55925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 
1726853557.55959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.56033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853557.56051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853557.56098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.56173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.72791: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 13:32:37.710815", "end": "2024-09-20 13:32:37.726993", "delta": "0:00:00.016178", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853557.74294: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853557.74328: stderr chunk (state=3): >>><<< 24160 1726853557.74332: stdout chunk (state=3): >>><<< 24160 1726853557.74349: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 13:32:37.710815", "end": "2024-09-20 13:32:37.726993", "delta": "0:00:00.016178", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.45.153 closed. 24160 1726853557.74382: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853557.74392: _low_level_execute_command(): starting 24160 1726853557.74397: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853557.4531648-25760-104913346055049/ > /dev/null 2>&1 && sleep 0' 24160 1726853557.74847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853557.74852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 24160 1726853557.74859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.74861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 24160 1726853557.74863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853557.74913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853557.74920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853557.74922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853557.74960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853557.76790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853557.76817: stderr chunk (state=3): >>><<< 24160 1726853557.76821: stdout chunk (state=3): >>><<< 24160 1726853557.76834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853557.76840: handler run complete 24160 1726853557.76856: Evaluated conditional (False): False 24160 1726853557.76867: attempt loop complete, returning result 24160 1726853557.76870: _execute() done 24160 1726853557.76874: dumping result to json 24160 1726853557.76879: done dumping result, returning 24160 1726853557.76886: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-5676-4eb4-00000000062d] 24160 1726853557.76889: sending task result for task 02083763-bbaf-5676-4eb4-00000000062d 24160 1726853557.76987: done sending task result for task 02083763-bbaf-5676-4eb4-00000000062d 24160 1726853557.76989: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016178", "end": "2024-09-20 13:32:37.726993", "rc": 1, "start": "2024-09-20 13:32:37.710815" } MSG: non-zero return code ...ignoring 24160 1726853557.77086: no more pending results, returning what we have 24160 1726853557.77089: results queue empty 24160 1726853557.77091: checking for any_errors_fatal 24160 1726853557.77098: done checking for any_errors_fatal 24160 1726853557.77099: checking for max_fail_percentage 24160 1726853557.77101: done checking for max_fail_percentage 24160 1726853557.77102: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.77102: done checking to see if all hosts have failed 24160 1726853557.77103: getting the remaining hosts for this loop 24160 1726853557.77104: done getting the remaining hosts for this loop 24160 1726853557.77107: getting the next task for host managed_node1 24160 1726853557.77114: done getting next task for host managed_node1 24160 
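The fatal-but-ignored task above runs `nmcli -f NAME,FILENAME connection show | grep ethtest0 | grep /etc` on the managed node: it succeeds (rc=0) only if some connection profile both matches the name `ethtest0` and is stored under `/etc`, so rc=1 here just means no such profile file exists yet. A minimal Python sketch of the same two-stage filter — the sample `nmcli` output below is hypothetical, not taken from this run:

```python
def profile_in_etc(nmcli_output: str, profile: str) -> bool:
    """Mimic `nmcli ... | grep <profile> | grep /etc`: True iff some
    output line mentions both the profile name and an /etc path."""
    return any(profile in line and "/etc" in line
               for line in nmcli_output.splitlines())

# Hypothetical `nmcli -f NAME,FILENAME connection show` output:
sample = """NAME      FILENAME
eth0      /etc/NetworkManager/system-connections/eth0.nmconnection
lo        /run/NetworkManager/system-connections/lo.nmconnection
"""

print(profile_in_etc(sample, "ethtest0"))  # no match -> the shell pipeline would exit 1
print(profile_in_etc(sample, "eth0"))      # match -> the pipeline would exit 0
```

Because the task sets the downstream `nm_profile_exists` fact, the rc=1 is expected and ignored; the later `Set NM profile exist flag ...` task then skips on `nm_profile_exists.rc == 0` being false, exactly as the trace shows.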
1726853557.77116: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24160 1726853557.77121: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853557.77125: getting variables 24160 1726853557.77126: in VariableManager get_vars() 24160 1726853557.77152: Calling all_inventory to load vars for managed_node1 24160 1726853557.77155: Calling groups_inventory to load vars for managed_node1 24160 1726853557.77158: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.77167: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.77170: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.77180: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.77964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.78935: done with get_vars() 24160 1726853557.78950: done getting variables 24160 1726853557.78993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:32:37 -0400 (0:00:00.403) 0:00:34.192 ****** 24160 1726853557.79017: entering _queue_task() for managed_node1/set_fact 24160 1726853557.79239: worker is 1 (out of 1 available) 24160 1726853557.79253: exiting _queue_task() for managed_node1/set_fact 24160 1726853557.79264: done queuing things up, now waiting for results queue to drain 24160 1726853557.79266: waiting for pending results... 24160 1726853557.79442: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24160 1726853557.79525: in run() - task 02083763-bbaf-5676-4eb4-00000000062e 24160 1726853557.79538: variable 'ansible_search_path' from source: unknown 24160 1726853557.79541: variable 'ansible_search_path' from source: unknown 24160 1726853557.79573: calling self._execute() 24160 1726853557.79651: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.79654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.79666: variable 'omit' from source: magic vars 24160 1726853557.79962: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.79973: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.80064: variable 'nm_profile_exists' from source: set_fact 24160 1726853557.80078: Evaluated conditional (nm_profile_exists.rc == 0): False 24160 1726853557.80081: when evaluation is False, skipping this task 24160 1726853557.80084: _execute() done 24160 1726853557.80087: dumping result to 
json 24160 1726853557.80090: done dumping result, returning 24160 1726853557.80098: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-5676-4eb4-00000000062e] 24160 1726853557.80101: sending task result for task 02083763-bbaf-5676-4eb4-00000000062e skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 24160 1726853557.80228: no more pending results, returning what we have 24160 1726853557.80233: results queue empty 24160 1726853557.80234: checking for any_errors_fatal 24160 1726853557.80245: done checking for any_errors_fatal 24160 1726853557.80245: checking for max_fail_percentage 24160 1726853557.80247: done checking for max_fail_percentage 24160 1726853557.80248: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.80248: done checking to see if all hosts have failed 24160 1726853557.80249: getting the remaining hosts for this loop 24160 1726853557.80250: done getting the remaining hosts for this loop 24160 1726853557.80253: getting the next task for host managed_node1 24160 1726853557.80262: done getting next task for host managed_node1 24160 1726853557.80264: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 24160 1726853557.80268: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853557.80272: getting variables 24160 1726853557.80274: in VariableManager get_vars() 24160 1726853557.80299: Calling all_inventory to load vars for managed_node1 24160 1726853557.80301: Calling groups_inventory to load vars for managed_node1 24160 1726853557.80304: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.80313: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.80315: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.80317: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.81082: done sending task result for task 02083763-bbaf-5676-4eb4-00000000062e 24160 1726853557.81086: WORKER PROCESS EXITING 24160 1726853557.81096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.81983: done with get_vars() 24160 1726853557.81998: done getting variables 24160 1726853557.82039: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853557.82126: variable 'profile' from source: include params 24160 1726853557.82129: variable 'interface' from source: set_fact 24160 1726853557.82174: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 
September 2024 13:32:37 -0400 (0:00:00.031) 0:00:34.224 ****** 24160 1726853557.82196: entering _queue_task() for managed_node1/command 24160 1726853557.82392: worker is 1 (out of 1 available) 24160 1726853557.82405: exiting _queue_task() for managed_node1/command 24160 1726853557.82417: done queuing things up, now waiting for results queue to drain 24160 1726853557.82418: waiting for pending results... 24160 1726853557.82598: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 24160 1726853557.82686: in run() - task 02083763-bbaf-5676-4eb4-000000000630 24160 1726853557.82698: variable 'ansible_search_path' from source: unknown 24160 1726853557.82702: variable 'ansible_search_path' from source: unknown 24160 1726853557.82729: calling self._execute() 24160 1726853557.82814: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.82817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.82826: variable 'omit' from source: magic vars 24160 1726853557.83094: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.83105: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.83187: variable 'profile_stat' from source: set_fact 24160 1726853557.83201: Evaluated conditional (profile_stat.stat.exists): False 24160 1726853557.83205: when evaluation is False, skipping this task 24160 1726853557.83208: _execute() done 24160 1726853557.83210: dumping result to json 24160 1726853557.83212: done dumping result, returning 24160 1726853557.83215: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [02083763-bbaf-5676-4eb4-000000000630] 24160 1726853557.83220: sending task result for task 02083763-bbaf-5676-4eb4-000000000630 24160 1726853557.83303: done sending task result for task 02083763-bbaf-5676-4eb4-000000000630 24160 1726853557.83306: WORKER PROCESS 
EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24160 1726853557.83356: no more pending results, returning what we have 24160 1726853557.83359: results queue empty 24160 1726853557.83360: checking for any_errors_fatal 24160 1726853557.83367: done checking for any_errors_fatal 24160 1726853557.83367: checking for max_fail_percentage 24160 1726853557.83369: done checking for max_fail_percentage 24160 1726853557.83369: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.83370: done checking to see if all hosts have failed 24160 1726853557.83372: getting the remaining hosts for this loop 24160 1726853557.83374: done getting the remaining hosts for this loop 24160 1726853557.83377: getting the next task for host managed_node1 24160 1726853557.83383: done getting next task for host managed_node1 24160 1726853557.83385: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 24160 1726853557.83388: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853557.83392: getting variables 24160 1726853557.83393: in VariableManager get_vars() 24160 1726853557.83415: Calling all_inventory to load vars for managed_node1 24160 1726853557.83417: Calling groups_inventory to load vars for managed_node1 24160 1726853557.83420: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.83429: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.83433: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.83435: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.84321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.85189: done with get_vars() 24160 1726853557.85205: done getting variables 24160 1726853557.85247: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853557.85328: variable 'profile' from source: include params 24160 1726853557.85331: variable 'interface' from source: set_fact 24160 1726853557.85373: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:32:37 -0400 (0:00:00.031) 0:00:34.256 ****** 24160 1726853557.85396: entering _queue_task() for managed_node1/set_fact 24160 1726853557.85669: worker is 1 (out of 1 available) 24160 1726853557.85685: exiting _queue_task() for managed_node1/set_fact 24160 1726853557.85697: done queuing things up, now waiting for results queue to drain 24160 1726853557.85698: waiting for pending results... 
24160 1726853557.85879: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 24160 1726853557.85967: in run() - task 02083763-bbaf-5676-4eb4-000000000631 24160 1726853557.85980: variable 'ansible_search_path' from source: unknown 24160 1726853557.85983: variable 'ansible_search_path' from source: unknown 24160 1726853557.86016: calling self._execute() 24160 1726853557.86100: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.86103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.86114: variable 'omit' from source: magic vars 24160 1726853557.86396: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.86406: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.86493: variable 'profile_stat' from source: set_fact 24160 1726853557.86503: Evaluated conditional (profile_stat.stat.exists): False 24160 1726853557.86505: when evaluation is False, skipping this task 24160 1726853557.86508: _execute() done 24160 1726853557.86511: dumping result to json 24160 1726853557.86514: done dumping result, returning 24160 1726853557.86520: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [02083763-bbaf-5676-4eb4-000000000631] 24160 1726853557.86525: sending task result for task 02083763-bbaf-5676-4eb4-000000000631 24160 1726853557.86608: done sending task result for task 02083763-bbaf-5676-4eb4-000000000631 24160 1726853557.86611: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24160 1726853557.86657: no more pending results, returning what we have 24160 1726853557.86661: results queue empty 24160 1726853557.86662: checking for any_errors_fatal 24160 1726853557.86669: done checking for any_errors_fatal 24160 1726853557.86670: 
checking for max_fail_percentage 24160 1726853557.86673: done checking for max_fail_percentage 24160 1726853557.86674: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.86675: done checking to see if all hosts have failed 24160 1726853557.86675: getting the remaining hosts for this loop 24160 1726853557.86677: done getting the remaining hosts for this loop 24160 1726853557.86680: getting the next task for host managed_node1 24160 1726853557.86688: done getting next task for host managed_node1 24160 1726853557.86690: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 24160 1726853557.86694: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853557.86698: getting variables 24160 1726853557.86700: in VariableManager get_vars() 24160 1726853557.86727: Calling all_inventory to load vars for managed_node1 24160 1726853557.86729: Calling groups_inventory to load vars for managed_node1 24160 1726853557.86733: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.86743: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.86745: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.86748: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.87594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.89015: done with get_vars() 24160 1726853557.89036: done getting variables 24160 1726853557.89085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853557.89168: variable 'profile' from source: include params 24160 1726853557.89173: variable 'interface' from source: set_fact 24160 1726853557.89214: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:32:37 -0400 (0:00:00.038) 0:00:34.295 ****** 24160 1726853557.89237: entering _queue_task() for managed_node1/command 24160 1726853557.89487: worker is 1 (out of 1 available) 24160 1726853557.89501: exiting _queue_task() for managed_node1/command 24160 1726853557.89513: done queuing things up, now waiting for results queue to drain 24160 1726853557.89515: waiting for pending results... 
24160 1726853557.89703: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 24160 1726853557.89781: in run() - task 02083763-bbaf-5676-4eb4-000000000632 24160 1726853557.89794: variable 'ansible_search_path' from source: unknown 24160 1726853557.89797: variable 'ansible_search_path' from source: unknown 24160 1726853557.89826: calling self._execute() 24160 1726853557.89909: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.89913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.89921: variable 'omit' from source: magic vars 24160 1726853557.90188: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.90199: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.90285: variable 'profile_stat' from source: set_fact 24160 1726853557.90297: Evaluated conditional (profile_stat.stat.exists): False 24160 1726853557.90301: when evaluation is False, skipping this task 24160 1726853557.90303: _execute() done 24160 1726853557.90306: dumping result to json 24160 1726853557.90308: done dumping result, returning 24160 1726853557.90313: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [02083763-bbaf-5676-4eb4-000000000632] 24160 1726853557.90318: sending task result for task 02083763-bbaf-5676-4eb4-000000000632 24160 1726853557.90399: done sending task result for task 02083763-bbaf-5676-4eb4-000000000632 24160 1726853557.90401: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24160 1726853557.90452: no more pending results, returning what we have 24160 1726853557.90458: results queue empty 24160 1726853557.90460: checking for any_errors_fatal 24160 1726853557.90466: done checking for any_errors_fatal 24160 1726853557.90466: checking for 
max_fail_percentage 24160 1726853557.90468: done checking for max_fail_percentage 24160 1726853557.90469: checking to see if all hosts have failed and the running result is not ok 24160 1726853557.90470: done checking to see if all hosts have failed 24160 1726853557.90472: getting the remaining hosts for this loop 24160 1726853557.90473: done getting the remaining hosts for this loop 24160 1726853557.90477: getting the next task for host managed_node1 24160 1726853557.90484: done getting next task for host managed_node1 24160 1726853557.90486: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 24160 1726853557.90490: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853557.90494: getting variables 24160 1726853557.90495: in VariableManager get_vars() 24160 1726853557.90526: Calling all_inventory to load vars for managed_node1 24160 1726853557.90528: Calling groups_inventory to load vars for managed_node1 24160 1726853557.90532: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.90541: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.90543: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.90545: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.92379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.93972: done with get_vars() 24160 1726853557.93997: done getting variables 24160 1726853557.94054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853557.94165: variable 'profile' from source: include params 24160 1726853557.94169: variable 'interface' from source: set_fact 24160 1726853557.94227: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:32:37 -0400 (0:00:00.050) 0:00:34.345 ****** 24160 1726853557.94259: entering _queue_task() for managed_node1/set_fact 24160 1726853557.94589: worker is 1 (out of 1 available) 24160 1726853557.94601: exiting _queue_task() for managed_node1/set_fact 24160 1726853557.94614: done queuing things up, now waiting for results queue to drain 24160 1726853557.94615: waiting for pending results... 
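The "Get the fingerprint comment" task above was skipped because the conditional `profile_stat.stat.exists` evaluated to False, and the "Verify the fingerprint comment" task just queued is about to skip for the same reason. A task guarded this way in `get_profile_stat.yml` would look roughly like the following sketch; the file's actual contents (and the `command`/grep body) are not shown in this log and are assumptions:

```yaml
# Hypothetical sketch -- the real task lives in
# tests/network/playbooks/tasks/get_profile_stat.yml (not visible here).
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint
  when: profile_stat.stat.exists   # evaluated False above => task skipped
```

When `when:` evaluates False, the executor short-circuits before the action plugin runs, which is why the result carries only `false_condition` and `skip_reason` rather than module output.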
24160 1726853557.94956: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 24160 1726853557.95000: in run() - task 02083763-bbaf-5676-4eb4-000000000633 24160 1726853557.95023: variable 'ansible_search_path' from source: unknown 24160 1726853557.95033: variable 'ansible_search_path' from source: unknown 24160 1726853557.95082: calling self._execute() 24160 1726853557.95209: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853557.95222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853557.95276: variable 'omit' from source: magic vars 24160 1726853557.95629: variable 'ansible_distribution_major_version' from source: facts 24160 1726853557.95655: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853557.95781: variable 'profile_stat' from source: set_fact 24160 1726853557.95800: Evaluated conditional (profile_stat.stat.exists): False 24160 1726853557.95808: when evaluation is False, skipping this task 24160 1726853557.95816: _execute() done 24160 1726853557.95857: dumping result to json 24160 1726853557.95861: done dumping result, returning 24160 1726853557.95864: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [02083763-bbaf-5676-4eb4-000000000633] 24160 1726853557.95867: sending task result for task 02083763-bbaf-5676-4eb4-000000000633 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24160 1726853557.96214: no more pending results, returning what we have 24160 1726853557.96217: results queue empty 24160 1726853557.96218: checking for any_errors_fatal 24160 1726853557.96224: done checking for any_errors_fatal 24160 1726853557.96225: checking for max_fail_percentage 24160 1726853557.96226: done checking for max_fail_percentage 24160 1726853557.96227: checking to see if all hosts 
have failed and the running result is not ok 24160 1726853557.96228: done checking to see if all hosts have failed 24160 1726853557.96228: getting the remaining hosts for this loop 24160 1726853557.96230: done getting the remaining hosts for this loop 24160 1726853557.96234: getting the next task for host managed_node1 24160 1726853557.96242: done getting next task for host managed_node1 24160 1726853557.96245: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 24160 1726853557.96249: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853557.96253: getting variables 24160 1726853557.96254: in VariableManager get_vars() 24160 1726853557.96282: Calling all_inventory to load vars for managed_node1 24160 1726853557.96285: Calling groups_inventory to load vars for managed_node1 24160 1726853557.96288: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853557.96298: Calling all_plugins_play to load vars for managed_node1 24160 1726853557.96301: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853557.96304: Calling groups_plugins_play to load vars for managed_node1 24160 1726853557.96884: done sending task result for task 02083763-bbaf-5676-4eb4-000000000633 24160 1726853557.96888: WORKER PROCESS EXITING 24160 1726853557.97689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853557.99396: done with get_vars() 24160 1726853557.99419: done getting variables 24160 1726853557.99482: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853557.99596: variable 'profile' from source: include params 24160 1726853557.99599: variable 'interface' from source: set_fact 24160 1726853557.99656: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:32:37 -0400 (0:00:00.054) 0:00:34.399 ****** 24160 1726853557.99689: entering _queue_task() for managed_node1/assert 24160 1726853558.00040: worker is 1 (out of 1 available) 24160 1726853558.00053: exiting _queue_task() for managed_node1/assert 24160 
1726853558.00066: done queuing things up, now waiting for results queue to drain 24160 1726853558.00067: waiting for pending results... 24160 1726853558.00348: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' 24160 1726853558.00458: in run() - task 02083763-bbaf-5676-4eb4-000000000613 24160 1726853558.00482: variable 'ansible_search_path' from source: unknown 24160 1726853558.00494: variable 'ansible_search_path' from source: unknown 24160 1726853558.00537: calling self._execute() 24160 1726853558.00653: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.00666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.00683: variable 'omit' from source: magic vars 24160 1726853558.01060: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.01082: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.01096: variable 'omit' from source: magic vars 24160 1726853558.01138: variable 'omit' from source: magic vars 24160 1726853558.01244: variable 'profile' from source: include params 24160 1726853558.01260: variable 'interface' from source: set_fact 24160 1726853558.01325: variable 'interface' from source: set_fact 24160 1726853558.01349: variable 'omit' from source: magic vars 24160 1726853558.01401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853558.01441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853558.01468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853558.01494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.01511: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.01547: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853558.01557: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.01584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.01675: Set connection var ansible_shell_executable to /bin/sh 24160 1726853558.01876: Set connection var ansible_pipelining to False 24160 1726853558.01878: Set connection var ansible_connection to ssh 24160 1726853558.01880: Set connection var ansible_shell_type to sh 24160 1726853558.01882: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853558.01884: Set connection var ansible_timeout to 10 24160 1726853558.01886: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.01888: variable 'ansible_connection' from source: unknown 24160 1726853558.01890: variable 'ansible_module_compression' from source: unknown 24160 1726853558.01891: variable 'ansible_shell_type' from source: unknown 24160 1726853558.01893: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.01894: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.01896: variable 'ansible_pipelining' from source: unknown 24160 1726853558.01898: variable 'ansible_timeout' from source: unknown 24160 1726853558.01899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.01915: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853558.01929: variable 'omit' from source: magic vars 24160 1726853558.01937: starting 
attempt loop 24160 1726853558.01942: running the handler 24160 1726853558.02068: variable 'lsr_net_profile_exists' from source: set_fact 24160 1726853558.02081: Evaluated conditional (not lsr_net_profile_exists): True 24160 1726853558.02094: handler run complete 24160 1726853558.02112: attempt loop complete, returning result 24160 1726853558.02124: _execute() done 24160 1726853558.02131: dumping result to json 24160 1726853558.02139: done dumping result, returning 24160 1726853558.02151: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' [02083763-bbaf-5676-4eb4-000000000613] 24160 1726853558.02159: sending task result for task 02083763-bbaf-5676-4eb4-000000000613 24160 1726853558.02377: done sending task result for task 02083763-bbaf-5676-4eb4-000000000613 24160 1726853558.02380: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24160 1726853558.02428: no more pending results, returning what we have 24160 1726853558.02431: results queue empty 24160 1726853558.02432: checking for any_errors_fatal 24160 1726853558.02439: done checking for any_errors_fatal 24160 1726853558.02440: checking for max_fail_percentage 24160 1726853558.02442: done checking for max_fail_percentage 24160 1726853558.02443: checking to see if all hosts have failed and the running result is not ok 24160 1726853558.02443: done checking to see if all hosts have failed 24160 1726853558.02444: getting the remaining hosts for this loop 24160 1726853558.02446: done getting the remaining hosts for this loop 24160 1726853558.02449: getting the next task for host managed_node1 24160 1726853558.02456: done getting next task for host managed_node1 24160 1726853558.02459: ^ task is: TASK: Include the task 'assert_device_absent.yml' 24160 1726853558.02461: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853558.02465: getting variables 24160 1726853558.02467: in VariableManager get_vars() 24160 1726853558.02500: Calling all_inventory to load vars for managed_node1 24160 1726853558.02502: Calling groups_inventory to load vars for managed_node1 24160 1726853558.02506: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.02519: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.02522: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.02524: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.04017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.05764: done with get_vars() 24160 1726853558.05789: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:89 Friday 20 September 2024 13:32:38 -0400 (0:00:00.061) 0:00:34.461 ****** 24160 1726853558.05886: entering _queue_task() for managed_node1/include_tasks 24160 1726853558.06212: worker is 1 (out of 1 available) 24160 1726853558.06228: exiting _queue_task() for managed_node1/include_tasks 24160 1726853558.06243: done queuing things up, now waiting for results queue to drain 24160 1726853558.06245: waiting for pending results... 
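The assert above reported "All assertions passed" after `Evaluated conditional (not lsr_net_profile_exists): True`. Based on the task name, path (`assert_profile_absent.yml:5`), and that condition, the task is presumably shaped like this minimal sketch (the file's exact contents are not in this log):

```yaml
# Hypothetical sketch of tasks/assert_profile_absent.yml (line 5 per the log).
- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists   # evaluated True => "All assertions passed"
```

`lsr_net_profile_exists` comes from an earlier `set_fact` (`from source: set_fact` in the log), so the assert is validating state recorded by a previous task rather than probing the host directly.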
24160 1726853558.06555: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 24160 1726853558.06695: in run() - task 02083763-bbaf-5676-4eb4-00000000009e 24160 1726853558.06699: variable 'ansible_search_path' from source: unknown 24160 1726853558.06741: calling self._execute() 24160 1726853558.06981: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.06985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.06987: variable 'omit' from source: magic vars 24160 1726853558.07321: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.07343: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.07356: _execute() done 24160 1726853558.07366: dumping result to json 24160 1726853558.07439: done dumping result, returning 24160 1726853558.07443: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [02083763-bbaf-5676-4eb4-00000000009e] 24160 1726853558.07445: sending task result for task 02083763-bbaf-5676-4eb4-00000000009e 24160 1726853558.07520: done sending task result for task 02083763-bbaf-5676-4eb4-00000000009e 24160 1726853558.07524: WORKER PROCESS EXITING 24160 1726853558.07572: no more pending results, returning what we have 24160 1726853558.07578: in VariableManager get_vars() 24160 1726853558.07615: Calling all_inventory to load vars for managed_node1 24160 1726853558.07618: Calling groups_inventory to load vars for managed_node1 24160 1726853558.07621: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.07635: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.07638: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.07640: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.09359: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.10898: done with get_vars() 24160 1726853558.10918: variable 'ansible_search_path' from source: unknown 24160 1726853558.10933: we have included files to process 24160 1726853558.10934: generating all_blocks data 24160 1726853558.10936: done generating all_blocks data 24160 1726853558.10943: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 24160 1726853558.10944: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 24160 1726853558.10946: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 24160 1726853558.11097: in VariableManager get_vars() 24160 1726853558.11112: done with get_vars() 24160 1726853558.11213: done processing included file 24160 1726853558.11216: iterating over new_blocks loaded from include file 24160 1726853558.11217: in VariableManager get_vars() 24160 1726853558.11228: done with get_vars() 24160 1726853558.11229: filtering new block on tags 24160 1726853558.11245: done filtering new block on tags 24160 1726853558.11247: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 24160 1726853558.11252: extending task lists for all hosts with included blocks 24160 1726853558.11465: done extending task lists 24160 1726853558.11467: done processing included files 24160 1726853558.11468: results queue empty 24160 1726853558.11468: checking for any_errors_fatal 24160 1726853558.11473: done checking for any_errors_fatal 24160 1726853558.11474: checking for max_fail_percentage 24160 1726853558.11475: done 
checking for max_fail_percentage 24160 1726853558.11475: checking to see if all hosts have failed and the running result is not ok 24160 1726853558.11476: done checking to see if all hosts have failed 24160 1726853558.11477: getting the remaining hosts for this loop 24160 1726853558.11478: done getting the remaining hosts for this loop 24160 1726853558.11480: getting the next task for host managed_node1 24160 1726853558.11483: done getting next task for host managed_node1 24160 1726853558.11485: ^ task is: TASK: Include the task 'get_interface_stat.yml' 24160 1726853558.11488: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853558.11490: getting variables 24160 1726853558.11491: in VariableManager get_vars() 24160 1726853558.11498: Calling all_inventory to load vars for managed_node1 24160 1726853558.11500: Calling groups_inventory to load vars for managed_node1 24160 1726853558.11502: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.11508: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.11511: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.11513: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.12640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.14445: done with get_vars() 24160 1726853558.14470: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:32:38 -0400 (0:00:00.089) 0:00:34.551 ****** 24160 1726853558.14839: entering _queue_task() for managed_node1/include_tasks 24160 1726853558.15292: worker is 1 (out of 1 available) 24160 1726853558.15305: exiting _queue_task() for managed_node1/include_tasks 24160 1726853558.15317: done queuing things up, now waiting for results queue to drain 24160 1726853558.15318: waiting for pending results... 
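The include chain above (`tests_ipv6_disabled.yml:89` pulling in `assert_device_absent.yml`, which at line 3 pulls in `get_interface_stat.yml`) implies nested `include_tasks` of roughly this shape; the sketch is an assumption, since only the task names and paths appear in the log:

```yaml
# Hypothetical sketch of tasks/assert_device_absent.yml
# (path and task name from the log; file contents assumed).
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```

Each `include_tasks` is itself a task: it runs through the same TaskExecutor path ("sending task result", "WORKER PROCESS EXITING"), and the included file is then parsed and spliced in ("generating all_blocks data", "extending task lists for all hosts with included blocks").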
24160 1726853558.15673: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 24160 1726853558.15680: in run() - task 02083763-bbaf-5676-4eb4-000000000664 24160 1726853558.15701: variable 'ansible_search_path' from source: unknown 24160 1726853558.15709: variable 'ansible_search_path' from source: unknown 24160 1726853558.15748: calling self._execute() 24160 1726853558.15862: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.15882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.15896: variable 'omit' from source: magic vars 24160 1726853558.16278: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.16296: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.16315: _execute() done 24160 1726853558.16322: dumping result to json 24160 1726853558.16329: done dumping result, returning 24160 1726853558.16339: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-5676-4eb4-000000000664] 24160 1726853558.16348: sending task result for task 02083763-bbaf-5676-4eb4-000000000664 24160 1726853558.16475: done sending task result for task 02083763-bbaf-5676-4eb4-000000000664 24160 1726853558.16479: WORKER PROCESS EXITING 24160 1726853558.16545: no more pending results, returning what we have 24160 1726853558.16550: in VariableManager get_vars() 24160 1726853558.16587: Calling all_inventory to load vars for managed_node1 24160 1726853558.16590: Calling groups_inventory to load vars for managed_node1 24160 1726853558.16593: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.16607: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.16610: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.16612: Calling groups_plugins_play to load vars for managed_node1 24160 
1726853558.18328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.19956: done with get_vars() 24160 1726853558.20085: variable 'ansible_search_path' from source: unknown 24160 1726853558.20087: variable 'ansible_search_path' from source: unknown 24160 1726853558.20123: we have included files to process 24160 1726853558.20125: generating all_blocks data 24160 1726853558.20126: done generating all_blocks data 24160 1726853558.20127: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24160 1726853558.20128: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24160 1726853558.20130: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24160 1726853558.20523: done processing included file 24160 1726853558.20525: iterating over new_blocks loaded from include file 24160 1726853558.20526: in VariableManager get_vars() 24160 1726853558.20539: done with get_vars() 24160 1726853558.20540: filtering new block on tags 24160 1726853558.20555: done filtering new block on tags 24160 1726853558.20558: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 24160 1726853558.20562: extending task lists for all hosts with included blocks 24160 1726853558.20793: done extending task lists 24160 1726853558.20794: done processing included files 24160 1726853558.20795: results queue empty 24160 1726853558.20796: checking for any_errors_fatal 24160 1726853558.20799: done checking for any_errors_fatal 24160 1726853558.20800: checking for max_fail_percentage 24160 1726853558.20801: done checking for 
max_fail_percentage 24160 1726853558.20801: checking to see if all hosts have failed and the running result is not ok 24160 1726853558.20802: done checking to see if all hosts have failed 24160 1726853558.20803: getting the remaining hosts for this loop 24160 1726853558.20804: done getting the remaining hosts for this loop 24160 1726853558.20807: getting the next task for host managed_node1 24160 1726853558.20811: done getting next task for host managed_node1 24160 1726853558.20937: ^ task is: TASK: Get stat for interface {{ interface }} 24160 1726853558.20941: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853558.20944: getting variables 24160 1726853558.20945: in VariableManager get_vars() 24160 1726853558.20954: Calling all_inventory to load vars for managed_node1 24160 1726853558.20956: Calling groups_inventory to load vars for managed_node1 24160 1726853558.20959: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.20964: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.20966: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.20969: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.23581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.26259: done with get_vars() 24160 1726853558.26287: done getting variables 24160 1726853558.26451: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:32:38 -0400 (0:00:00.116) 0:00:34.667 ****** 24160 1726853558.26492: entering _queue_task() for managed_node1/stat 24160 1726853558.26998: worker is 1 (out of 1 available) 24160 1726853558.27010: exiting _queue_task() for managed_node1/stat 24160 1726853558.27022: done queuing things up, now waiting for results queue to drain 24160 1726853558.27023: waiting for pending results... 
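The queued stat task ("Get stat for interface ethtest0", from `get_interface_stat.yml:3`) templates `{{ interface }}` from `set_fact` and runs the `stat` module over SSH, as the `_low_level_execute_command()` calls that follow show. A plausible sketch, assuming the conventional `/sys/class/net` probe (the actual `path` and `register` names are not visible in this log):

```yaml
# Hypothetical sketch of tasks/get_interface_stat.yml (line 3 per the log);
# the stat path and register name are assumptions.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```

The subsequent log records show the normal module transport sequence: `echo ~` to resolve the remote home, then `umask 77 && mkdir -p ...` to create the mode-0700 `~/.ansible/tmp/ansible-tmp-*` directory the module payload is copied into.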
24160 1726853558.27590: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 24160 1726853558.27600: in run() - task 02083763-bbaf-5676-4eb4-000000000687 24160 1726853558.27603: variable 'ansible_search_path' from source: unknown 24160 1726853558.27606: variable 'ansible_search_path' from source: unknown 24160 1726853558.27609: calling self._execute() 24160 1726853558.27612: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.27615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.27617: variable 'omit' from source: magic vars 24160 1726853558.27883: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.27906: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.27921: variable 'omit' from source: magic vars 24160 1726853558.27978: variable 'omit' from source: magic vars 24160 1726853558.28090: variable 'interface' from source: set_fact 24160 1726853558.28121: variable 'omit' from source: magic vars 24160 1726853558.28175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853558.28214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853558.28245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853558.28278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.28295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.28328: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853558.28343: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.28351: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.28469: Set connection var ansible_shell_executable to /bin/sh 24160 1726853558.28486: Set connection var ansible_pipelining to False 24160 1726853558.28559: Set connection var ansible_connection to ssh 24160 1726853558.28562: Set connection var ansible_shell_type to sh 24160 1726853558.28564: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853558.28566: Set connection var ansible_timeout to 10 24160 1726853558.28569: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.28572: variable 'ansible_connection' from source: unknown 24160 1726853558.28575: variable 'ansible_module_compression' from source: unknown 24160 1726853558.28577: variable 'ansible_shell_type' from source: unknown 24160 1726853558.28579: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.28585: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.28587: variable 'ansible_pipelining' from source: unknown 24160 1726853558.28590: variable 'ansible_timeout' from source: unknown 24160 1726853558.28598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.28814: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 24160 1726853558.28831: variable 'omit' from source: magic vars 24160 1726853558.28842: starting attempt loop 24160 1726853558.28850: running the handler 24160 1726853558.28874: _low_level_execute_command(): starting 24160 1726853558.28912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853558.29621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853558.29638: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 24160 1726853558.29695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853558.29700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853558.29786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.29809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.29890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.31569: stdout chunk (state=3): >>>/root <<< 24160 1726853558.31982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.31986: stdout chunk (state=3): >>><<< 24160 1726853558.31988: stderr chunk (state=3): >>><<< 24160 1726853558.31991: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853558.31994: _low_level_execute_command(): starting 24160 1726853558.31997: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537 `" && echo ansible-tmp-1726853558.3189147-25797-124126052142537="` echo /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537 `" ) && sleep 0' 24160 1726853558.33059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853558.33146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.33280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.33306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.33381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.35270: stdout chunk (state=3): >>>ansible-tmp-1726853558.3189147-25797-124126052142537=/root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537 <<< 24160 1726853558.35413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.35423: stdout chunk (state=3): >>><<< 24160 1726853558.35435: stderr chunk (state=3): >>><<< 24160 1726853558.35459: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853558.3189147-25797-124126052142537=/root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853558.35577: variable 'ansible_module_compression' from source: unknown 24160 1726853558.35751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24160 1726853558.35804: variable 'ansible_facts' from source: unknown 24160 1726853558.35976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py 24160 1726853558.36494: Sending initial data 24160 1726853558.36497: Sent initial data (153 bytes) 24160 1726853558.37381: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.37384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853558.37458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853558.37462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.37485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 
1726853558.37490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.37687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.37711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.37791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.39310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853558.39368: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853558.39423: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpy2o9oorl /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py <<< 24160 1726853558.39427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py" <<< 24160 1726853558.39628: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpy2o9oorl" to remote "/root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py" <<< 24160 1726853558.40897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.40918: stderr chunk (state=3): >>><<< 24160 1726853558.40922: stdout chunk (state=3): >>><<< 24160 1726853558.40949: done transferring module to remote 24160 1726853558.40957: _low_level_execute_command(): starting 24160 1726853558.40960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/ /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py && sleep 0' 24160 1726853558.42390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.42394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.42397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853558.42399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.42423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.42465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.44286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.44290: stdout chunk (state=3): >>><<< 24160 1726853558.44296: stderr chunk (state=3): >>><<< 24160 1726853558.44312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853558.44316: _low_level_execute_command(): starting 24160 1726853558.44320: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/AnsiballZ_stat.py && sleep 0' 24160 1726853558.45474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853558.45544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.45585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.45605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853558.45764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.45789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853558.45829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
24160 1726853558.45847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.45987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.61460: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24160 1726853558.63115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853558.63119: stdout chunk (state=3): >>><<< 24160 1726853558.63134: stderr chunk (state=3): >>><<< 24160 1726853558.63258: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853558.63345: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853558.63533: _low_level_execute_command(): starting 24160 1726853558.63537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853558.3189147-25797-124126052142537/ > /dev/null 2>&1 && sleep 0' 24160 1726853558.64638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853558.64689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.64701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853558.64714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853558.64727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 24160 1726853558.64864: stderr chunk (state=3): >>>debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.64896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.65038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.66946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.66956: stdout chunk (state=3): >>><<< 24160 1726853558.66959: stderr chunk (state=3): >>><<< 24160 1726853558.66972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853558.66979: handler run complete 24160 1726853558.67000: attempt loop complete, returning result 24160 1726853558.67003: _execute() done 24160 1726853558.67006: dumping result to json 24160 1726853558.67008: done dumping result, returning 24160 1726853558.67018: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [02083763-bbaf-5676-4eb4-000000000687] 24160 1726853558.67020: sending task result for task 02083763-bbaf-5676-4eb4-000000000687 24160 1726853558.67201: done sending task result for task 02083763-bbaf-5676-4eb4-000000000687 ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 24160 1726853558.67325: no more pending results, returning what we have 24160 1726853558.67328: results queue empty 24160 1726853558.67329: checking for any_errors_fatal 24160 1726853558.67331: done checking for any_errors_fatal 24160 1726853558.67331: checking for max_fail_percentage 24160 1726853558.67333: done checking for max_fail_percentage 24160 1726853558.67333: checking to see if all hosts have failed and the running result is not ok 24160 1726853558.67334: done checking to see if all hosts have failed 24160 1726853558.67335: getting the remaining hosts for this loop 24160 1726853558.67336: done getting the remaining hosts for this loop 24160 1726853558.67340: getting the next task for host managed_node1 24160 1726853558.67347: done getting next task for host managed_node1 24160 1726853558.67349: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 24160 1726853558.67352: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853558.67358: getting variables 24160 1726853558.67360: in VariableManager get_vars() 24160 1726853558.67387: Calling all_inventory to load vars for managed_node1 24160 1726853558.67389: Calling groups_inventory to load vars for managed_node1 24160 1726853558.67392: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.67403: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.67405: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.67407: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.67983: WORKER PROCESS EXITING 24160 1726853558.69197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.71519: done with get_vars() 24160 1726853558.71552: done getting variables 24160 1726853558.71617: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 24160 1726853558.71736: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:32:38 -0400 (0:00:00.452) 0:00:35.120 ****** 24160 1726853558.71768: entering _queue_task() for managed_node1/assert 24160 1726853558.72127: worker is 1 (out of 1 available) 24160 1726853558.72141: exiting _queue_task() for managed_node1/assert 24160 1726853558.72153: done queuing things up, now waiting for results queue to drain 24160 1726853558.72155: waiting for pending results... 24160 1726853558.72428: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' 24160 1726853558.72533: in run() - task 02083763-bbaf-5676-4eb4-000000000665 24160 1726853558.72551: variable 'ansible_search_path' from source: unknown 24160 1726853558.72557: variable 'ansible_search_path' from source: unknown 24160 1726853558.72816: calling self._execute() 24160 1726853558.72990: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.72994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.72997: variable 'omit' from source: magic vars 24160 1726853558.73348: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.73368: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.73383: variable 'omit' from source: magic vars 24160 1726853558.73426: variable 'omit' from source: magic vars 24160 1726853558.73529: variable 'interface' from source: set_fact 24160 1726853558.73551: variable 'omit' from source: magic vars 24160 1726853558.73599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853558.73639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853558.73670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
24160 1726853558.73776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.73779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.73781: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853558.73784: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.73786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.73856: Set connection var ansible_shell_executable to /bin/sh 24160 1726853558.73868: Set connection var ansible_pipelining to False 24160 1726853558.73878: Set connection var ansible_connection to ssh 24160 1726853558.73884: Set connection var ansible_shell_type to sh 24160 1726853558.73896: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853558.73913: Set connection var ansible_timeout to 10 24160 1726853558.74014: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.74019: variable 'ansible_connection' from source: unknown 24160 1726853558.74029: variable 'ansible_module_compression' from source: unknown 24160 1726853558.74052: variable 'ansible_shell_type' from source: unknown 24160 1726853558.74078: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.74082: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.74084: variable 'ansible_pipelining' from source: unknown 24160 1726853558.74086: variable 'ansible_timeout' from source: unknown 24160 1726853558.74088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.74393: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853558.74396: variable 'omit' from source: magic vars 24160 1726853558.74399: starting attempt loop 24160 1726853558.74401: running the handler 24160 1726853558.74481: variable 'interface_stat' from source: set_fact 24160 1726853558.74500: Evaluated conditional (not interface_stat.stat.exists): True 24160 1726853558.74541: handler run complete 24160 1726853558.74577: attempt loop complete, returning result 24160 1726853558.74584: _execute() done 24160 1726853558.74592: dumping result to json 24160 1726853558.74603: done dumping result, returning 24160 1726853558.74614: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' [02083763-bbaf-5676-4eb4-000000000665] 24160 1726853558.74622: sending task result for task 02083763-bbaf-5676-4eb4-000000000665 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 24160 1726853558.74832: no more pending results, returning what we have 24160 1726853558.74836: results queue empty 24160 1726853558.74837: checking for any_errors_fatal 24160 1726853558.74849: done checking for any_errors_fatal 24160 1726853558.74850: checking for max_fail_percentage 24160 1726853558.74852: done checking for max_fail_percentage 24160 1726853558.74853: checking to see if all hosts have failed and the running result is not ok 24160 1726853558.74854: done checking to see if all hosts have failed 24160 1726853558.74855: getting the remaining hosts for this loop 24160 1726853558.74856: done getting the remaining hosts for this loop 24160 1726853558.74860: getting the next task for host managed_node1 24160 1726853558.74868: done getting next task for host managed_node1 24160 1726853558.74873: ^ task is: TASK: Verify network state restored to default 24160 1726853558.74876: ^ state is: HOST STATE: block=2, 
task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853558.74880: getting variables 24160 1726853558.74881: in VariableManager get_vars() 24160 1726853558.74908: Calling all_inventory to load vars for managed_node1 24160 1726853558.74911: Calling groups_inventory to load vars for managed_node1 24160 1726853558.74914: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.74925: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.74928: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.74930: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.75484: done sending task result for task 02083763-bbaf-5676-4eb4-000000000665 24160 1726853558.75487: WORKER PROCESS EXITING 24160 1726853558.76669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.78265: done with get_vars() 24160 1726853558.78288: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:91 Friday 20 September 2024 13:32:38 -0400 (0:00:00.066) 0:00:35.186 ****** 24160 1726853558.78379: entering _queue_task() for managed_node1/include_tasks 24160 1726853558.78690: worker is 1 (out of 1 available) 24160 1726853558.78701: exiting _queue_task() for managed_node1/include_tasks 24160 1726853558.78712: done queuing things up, now waiting for results queue to drain 24160 1726853558.78713: waiting for pending results... 
24160 1726853558.79092: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 24160 1726853558.79122: in run() - task 02083763-bbaf-5676-4eb4-00000000009f 24160 1726853558.79145: variable 'ansible_search_path' from source: unknown 24160 1726853558.79195: calling self._execute() 24160 1726853558.79307: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.79319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.79334: variable 'omit' from source: magic vars 24160 1726853558.79726: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.79748: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.79761: _execute() done 24160 1726853558.79769: dumping result to json 24160 1726853558.79780: done dumping result, returning 24160 1726853558.79791: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-5676-4eb4-00000000009f] 24160 1726853558.79801: sending task result for task 02083763-bbaf-5676-4eb4-00000000009f 24160 1726853558.79909: done sending task result for task 02083763-bbaf-5676-4eb4-00000000009f 24160 1726853558.79974: no more pending results, returning what we have 24160 1726853558.79979: in VariableManager get_vars() 24160 1726853558.80014: Calling all_inventory to load vars for managed_node1 24160 1726853558.80017: Calling groups_inventory to load vars for managed_node1 24160 1726853558.80021: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.80035: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.80039: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.80042: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.80884: WORKER PROCESS EXITING 24160 1726853558.81624: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.83162: done with get_vars() 24160 1726853558.83183: variable 'ansible_search_path' from source: unknown 24160 1726853558.83198: we have included files to process 24160 1726853558.83199: generating all_blocks data 24160 1726853558.83201: done generating all_blocks data 24160 1726853558.83205: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24160 1726853558.83206: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24160 1726853558.83209: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24160 1726853558.83591: done processing included file 24160 1726853558.83593: iterating over new_blocks loaded from include file 24160 1726853558.83595: in VariableManager get_vars() 24160 1726853558.83606: done with get_vars() 24160 1726853558.83607: filtering new block on tags 24160 1726853558.83623: done filtering new block on tags 24160 1726853558.83625: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 24160 1726853558.83630: extending task lists for all hosts with included blocks 24160 1726853558.83943: done extending task lists 24160 1726853558.83944: done processing included files 24160 1726853558.83945: results queue empty 24160 1726853558.83946: checking for any_errors_fatal 24160 1726853558.83949: done checking for any_errors_fatal 24160 1726853558.83950: checking for max_fail_percentage 24160 1726853558.83951: done checking for max_fail_percentage 24160 1726853558.83952: checking to see if all hosts have failed and the running 
result is not ok 24160 1726853558.83953: done checking to see if all hosts have failed 24160 1726853558.83953: getting the remaining hosts for this loop 24160 1726853558.83954: done getting the remaining hosts for this loop 24160 1726853558.83957: getting the next task for host managed_node1 24160 1726853558.83961: done getting next task for host managed_node1 24160 1726853558.83963: ^ task is: TASK: Check routes and DNS 24160 1726853558.83965: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853558.83967: getting variables 24160 1726853558.83968: in VariableManager get_vars() 24160 1726853558.83978: Calling all_inventory to load vars for managed_node1 24160 1726853558.83981: Calling groups_inventory to load vars for managed_node1 24160 1726853558.83983: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853558.83988: Calling all_plugins_play to load vars for managed_node1 24160 1726853558.83990: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853558.83993: Calling groups_plugins_play to load vars for managed_node1 24160 1726853558.85149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853558.86679: done with get_vars() 24160 1726853558.86698: done getting variables 24160 1726853558.86737: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:32:38 -0400 (0:00:00.083) 0:00:35.270 ****** 24160 1726853558.86765: entering _queue_task() for managed_node1/shell 24160 1726853558.87091: worker is 1 (out of 1 available) 24160 1726853558.87103: exiting _queue_task() for managed_node1/shell 24160 1726853558.87114: done queuing things up, now waiting for results queue to drain 24160 1726853558.87116: waiting for pending results... 
24160 1726853558.87388: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 24160 1726853558.87500: in run() - task 02083763-bbaf-5676-4eb4-00000000069f 24160 1726853558.87521: variable 'ansible_search_path' from source: unknown 24160 1726853558.87528: variable 'ansible_search_path' from source: unknown 24160 1726853558.87567: calling self._execute() 24160 1726853558.87672: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.87685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.87699: variable 'omit' from source: magic vars 24160 1726853558.88068: variable 'ansible_distribution_major_version' from source: facts 24160 1726853558.88087: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853558.88100: variable 'omit' from source: magic vars 24160 1726853558.88140: variable 'omit' from source: magic vars 24160 1726853558.88186: variable 'omit' from source: magic vars 24160 1726853558.88230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853558.88277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853558.88302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853558.88378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.88381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853558.88384: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853558.88386: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.88388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.88492: 
Set connection var ansible_shell_executable to /bin/sh 24160 1726853558.88502: Set connection var ansible_pipelining to False 24160 1726853558.88508: Set connection var ansible_connection to ssh 24160 1726853558.88513: Set connection var ansible_shell_type to sh 24160 1726853558.88522: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853558.88531: Set connection var ansible_timeout to 10 24160 1726853558.88553: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.88561: variable 'ansible_connection' from source: unknown 24160 1726853558.88567: variable 'ansible_module_compression' from source: unknown 24160 1726853558.88574: variable 'ansible_shell_type' from source: unknown 24160 1726853558.88580: variable 'ansible_shell_executable' from source: unknown 24160 1726853558.88591: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853558.88594: variable 'ansible_pipelining' from source: unknown 24160 1726853558.88677: variable 'ansible_timeout' from source: unknown 24160 1726853558.88681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853558.88753: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853558.88773: variable 'omit' from source: magic vars 24160 1726853558.88786: starting attempt loop 24160 1726853558.88794: running the handler 24160 1726853558.88814: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853558.88839: 
_low_level_execute_command(): starting 24160 1726853558.88852: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853558.89699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.89716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.89800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.91498: stdout chunk (state=3): >>>/root <<< 24160 1726853558.91626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.91642: stdout chunk (state=3): >>><<< 24160 1726853558.91663: stderr chunk (state=3): >>><<< 24160 1726853558.91688: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853558.91708: _low_level_execute_command(): starting 24160 1726853558.91792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141 `" && echo ansible-tmp-1726853558.9169545-25828-26802430314141="` echo /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141 `" ) && sleep 0' 24160 1726853558.92353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853558.92374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.92389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853558.92410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24160 1726853558.92493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.92552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853558.92580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.92594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.92672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.94622: stdout chunk (state=3): >>>ansible-tmp-1726853558.9169545-25828-26802430314141=/root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141 <<< 24160 1726853558.94784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.94788: stdout chunk (state=3): >>><<< 24160 1726853558.94790: stderr chunk (state=3): >>><<< 24160 1726853558.94976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853558.9169545-25828-26802430314141=/root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853558.94979: variable 'ansible_module_compression' from source: unknown 24160 1726853558.94982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853558.94984: variable 'ansible_facts' from source: unknown 24160 1726853558.95037: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py 24160 1726853558.95231: Sending initial data 24160 1726853558.95241: Sent initial data (155 bytes) 24160 1726853558.95887: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853558.95901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853558.95917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853558.95989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.96040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853558.96061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.96096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.96160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853558.97778: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853558.97851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853558.97909: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpik6ncvkc /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py <<< 24160 1726853558.97922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py" <<< 24160 1726853558.97961: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpik6ncvkc" to remote "/root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py" <<< 24160 1726853558.98737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853558.98795: stderr chunk (state=3): >>><<< 24160 1726853558.98837: stdout chunk (state=3): >>><<< 24160 1726853558.98840: done transferring module to remote 24160 1726853558.98857: _low_level_execute_command(): starting 24160 1726853558.98868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/ /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py && sleep 0' 24160 1726853558.99589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853558.99662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853558.99682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853558.99737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853558.99782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.01622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.01643: stderr chunk (state=3): >>><<< 24160 1726853559.01651: stdout chunk (state=3): >>><<< 24160 1726853559.01677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853559.01761: _low_level_execute_command(): starting 24160 1726853559.01765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/AnsiballZ_command.py && sleep 0' 24160 1726853559.02358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.02398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.18801: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 
00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2889sec preferred_lft 2889sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:32:39.176759", "end": "2024-09-20 13:32:39.185606", "delta": "0:00:00.008847", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853559.20333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 24160 1726853559.20478: stderr chunk (state=3): >>><<< 24160 1726853559.20482: stdout chunk (state=3): >>><<< 24160 1726853559.20484: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2889sec preferred_lft 2889sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:32:39.176759", "end": "2024-09-20 13:32:39.185606", "delta": "0:00:00.008847", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
24160 1726853559.20530: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853559.20547: _low_level_execute_command(): starting 24160 1726853559.20594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853558.9169545-25828-26802430314141/ > /dev/null 2>&1 && sleep 0' 24160 1726853559.22021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.22114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853559.22216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.22338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.24385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.24399: stdout chunk (state=3): >>><<< 24160 1726853559.24410: stderr chunk (state=3): >>><<< 24160 1726853559.24464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853559.24512: handler run complete 24160 1726853559.24538: Evaluated conditional (False): False 24160 1726853559.24778: attempt loop complete, returning result 24160 1726853559.24781: _execute() done 24160 1726853559.24784: dumping result to json 24160 1726853559.24786: done dumping result, returning 24160 1726853559.24788: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [02083763-bbaf-5676-4eb4-00000000069f] 24160 1726853559.24790: sending task result for task 02083763-bbaf-5676-4eb4-00000000069f 24160 1726853559.24860: done sending task result for task 02083763-bbaf-5676-4eb4-00000000069f 24160 1726853559.24863: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008847", "end": "2024-09-20 13:32:39.185606", "rc": 0, "start": "2024-09-20 13:32:39.176759" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2889sec preferred_lft 2889sec inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 
1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 24160 1726853559.24953: no more pending results, returning what we have 24160 1726853559.24957: results queue empty 24160 1726853559.24958: checking for any_errors_fatal 24160 1726853559.24959: done checking for any_errors_fatal 24160 1726853559.24960: checking for max_fail_percentage 24160 1726853559.24962: done checking for max_fail_percentage 24160 1726853559.24963: checking to see if all hosts have failed and the running result is not ok 24160 1726853559.24963: done checking to see if all hosts have failed 24160 1726853559.24964: getting the remaining hosts for this loop 24160 1726853559.24966: done getting the remaining hosts for this loop 24160 1726853559.24970: getting the next task for host managed_node1 24160 1726853559.24979: done getting next task for host managed_node1 24160 1726853559.24981: ^ task is: TASK: Verify DNS and network connectivity 24160 1726853559.24985: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853559.24989: getting variables 24160 1726853559.24990: in VariableManager get_vars() 24160 1726853559.25020: Calling all_inventory to load vars for managed_node1 24160 1726853559.25023: Calling groups_inventory to load vars for managed_node1 24160 1726853559.25027: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853559.25040: Calling all_plugins_play to load vars for managed_node1 24160 1726853559.25047: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853559.25050: Calling groups_plugins_play to load vars for managed_node1 24160 1726853559.27951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853559.31229: done with get_vars() 24160 1726853559.31258: done getting variables 24160 1726853559.31315: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:32:39 -0400 (0:00:00.445) 0:00:35.716 ****** 24160 1726853559.31345: entering _queue_task() for managed_node1/shell 24160 1726853559.32103: worker is 1 (out of 1 available) 24160 1726853559.32116: exiting _queue_task() for managed_node1/shell 24160 1726853559.32130: done queuing things up, now waiting for results queue to drain 24160 1726853559.32131: waiting for pending results... 
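[Editor's note] The two inline scripts this stretch of the log runs through `ansible.legacy.command` — the routes/DNS dump in the "Check routes and DNS" result above, and the `getent`/`curl` loop of the "Verify DNS and network connectivity" task queued here — can be lifted out of the `module_args` JSON as standalone POSIX shell for local debugging. This is a sketch, not the task files themselves: the function names are mine, and it assumes iproute2 (`ip`), `getent`, and `curl` are on PATH. One hedged observation: the logged connectivity script calls `curl` without `-sSf`, which is why the progress meters appear in the task's stderr; the sketch adds those flags (marked below) so an HTTP error status would also trip the `if !` check.

```shell
# Standalone sketch of the inline scripts captured in this log's
# module_args. Function names are mine; the bodies mirror the log.

# Mirrors TASK [Check routes and DNS].
net_diag() {
    set -eu
    echo IP;            ip a
    echo "IP ROUTE";    ip route
    echo "IP -6 ROUTE"; ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo "NO /etc/resolv.conf"
        # `|| :` keeps a missing glob from aborting under set -e,
        # exactly as in the logged script.
        ls -alrtF /etc/resolv.* || :
    fi
}

# Mirrors TASK [Verify DNS and network connectivity].
dns_check() {
    set -eu
    echo "CHECK DNS AND CONNECTIVITY"
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        getent hosts "$host" \
            || { echo "FAILED to lookup host $host"; return 1; }
        # -sSf is my addition (not in the logged script): silence the
        # progress meter, keep error messages, fail on HTTP >= 400.
        curl -sSf -o /dev/null "https://$host" \
            || { echo "FAILED to contact host $host"; return 1; }
    done
}
```

Running `net_diag` on the managed node should reproduce the STDOUT block shown in the task result above (interface list, v4/v6 routes, resolv.conf contents); `dns_check` reproduces the mirror lookups, minus the stderr noise.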
24160 1726853559.32799: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 24160 1726853559.32992: in run() - task 02083763-bbaf-5676-4eb4-0000000006a0 24160 1726853559.33018: variable 'ansible_search_path' from source: unknown 24160 1726853559.33029: variable 'ansible_search_path' from source: unknown 24160 1726853559.33078: calling self._execute() 24160 1726853559.33331: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853559.33344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853559.33400: variable 'omit' from source: magic vars 24160 1726853559.34260: variable 'ansible_distribution_major_version' from source: facts 24160 1726853559.34326: Evaluated conditional (ansible_distribution_major_version != '6'): True 24160 1726853559.34568: variable 'ansible_facts' from source: unknown 24160 1726853559.36888: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 24160 1726853559.36892: variable 'omit' from source: magic vars 24160 1726853559.36965: variable 'omit' from source: magic vars 24160 1726853559.37047: variable 'omit' from source: magic vars 24160 1726853559.37177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24160 1726853559.37219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24160 1726853559.37264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24160 1726853559.37459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853559.37463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24160 1726853559.37465: variable 'inventory_hostname' from source: host vars for 'managed_node1' 24160 1726853559.37467: variable 
'ansible_host' from source: host vars for 'managed_node1' 24160 1726853559.37469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853559.37621: Set connection var ansible_shell_executable to /bin/sh 24160 1726853559.37685: Set connection var ansible_pipelining to False 24160 1726853559.37693: Set connection var ansible_connection to ssh 24160 1726853559.37699: Set connection var ansible_shell_type to sh 24160 1726853559.37710: Set connection var ansible_module_compression to ZIP_DEFLATED 24160 1726853559.37725: Set connection var ansible_timeout to 10 24160 1726853559.37752: variable 'ansible_shell_executable' from source: unknown 24160 1726853559.37794: variable 'ansible_connection' from source: unknown 24160 1726853559.37818: variable 'ansible_module_compression' from source: unknown 24160 1726853559.37976: variable 'ansible_shell_type' from source: unknown 24160 1726853559.37980: variable 'ansible_shell_executable' from source: unknown 24160 1726853559.37982: variable 'ansible_host' from source: host vars for 'managed_node1' 24160 1726853559.37985: variable 'ansible_pipelining' from source: unknown 24160 1726853559.37987: variable 'ansible_timeout' from source: unknown 24160 1726853559.37991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 24160 1726853559.38293: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853559.38297: variable 'omit' from source: magic vars 24160 1726853559.38299: starting attempt loop 24160 1726853559.38301: running the handler 24160 1726853559.38304: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 24160 1726853559.38351: _low_level_execute_command(): starting 24160 1726853559.38385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24160 1726853559.40399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24160 1726853559.40404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.40429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853559.40481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.40527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.42212: stdout chunk (state=3): >>>/root <<< 24160 1726853559.42345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.42574: stdout chunk (state=3): >>><<< 24160 1726853559.42578: stderr chunk 
(state=3): >>><<< 24160 1726853559.42582: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853559.42585: _low_level_execute_command(): starting 24160 1726853559.42590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911 `" && echo ansible-tmp-1726853559.4239447-25844-140942255535911="` echo /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911 `" ) && sleep 0' 24160 1726853559.43726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853559.43780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.44007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853559.44034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.44086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.46031: stdout chunk (state=3): >>>ansible-tmp-1726853559.4239447-25844-140942255535911=/root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911 <<< 24160 1726853559.46199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.46203: stdout chunk (state=3): >>><<< 24160 1726853559.46205: stderr chunk (state=3): >>><<< 24160 1726853559.46232: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853559.4239447-25844-140942255535911=/root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853559.46281: variable 'ansible_module_compression' from source: unknown 24160 1726853559.46339: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24160jdl187cr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24160 1726853559.46389: variable 'ansible_facts' from source: unknown 24160 1726853559.46485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py 24160 1726853559.46745: Sending initial data 24160 1726853559.46748: Sent initial data (156 bytes) 24160 1726853559.47492: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.47617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853559.47638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.47721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.49270: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 24160 1726853559.49326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 24160 1726853559.49395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfw_xz1hq /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py <<< 24160 1726853559.49413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py" <<< 24160 1726853559.49440: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-24160jdl187cr/tmpfw_xz1hq" to remote "/root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py" <<< 24160 1726853559.50308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.50311: stdout chunk (state=3): >>><<< 24160 1726853559.50314: stderr chunk (state=3): >>><<< 24160 1726853559.50316: done transferring module to remote 24160 1726853559.50318: _low_level_execute_command(): starting 24160 1726853559.50320: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/ /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py && sleep 0' 24160 1726853559.50940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.51015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853559.51082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853559.51086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.51112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.52978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.52982: stdout chunk (state=3): >>><<< 24160 1726853559.52984: stderr chunk (state=3): >>><<< 24160 1726853559.52987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853559.52993: _low_level_execute_command(): starting 24160 1726853559.52996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/AnsiballZ_command.py && sleep 0' 24160 1726853559.54346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853559.54522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.54563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.80107: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND 
CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4733 0 --:--:-- --:--:-- --:--:-- 4765\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 16734 0 --:--:-- --:--:-- --:--:-- 17117", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:32:39.696157", "end": "2024-09-20 13:32:39.799775", "delta": "0:00:00.103618", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24160 1726853559.81766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 24160 1726853559.81791: stderr chunk (state=3): >>><<< 24160 1726853559.81795: stdout chunk (state=3): >>><<< 24160 1726853559.81814: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4733 0 --:--:-- --:--:-- --:--:-- 4765\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 16734 0 --:--:-- --:--:-- --:--:-- 17117", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:32:39.696157", "end": "2024-09-20 13:32:39.799775", "delta": "0:00:00.103618", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 24160 1726853559.81848: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24160 1726853559.81855: _low_level_execute_command(): starting 24160 1726853559.81863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853559.4239447-25844-140942255535911/ > /dev/null 2>&1 && sleep 0' 24160 1726853559.82266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24160 1726853559.82276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.82293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24160 1726853559.82297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24160 1726853559.82353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 24160 1726853559.82359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24160 1726853559.82361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24160 1726853559.82397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24160 1726853559.84194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24160 1726853559.84212: stderr chunk (state=3): >>><<< 24160 1726853559.84215: stdout chunk (state=3): >>><<< 24160 1726853559.84226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24160 1726853559.84233: handler run complete 24160 1726853559.84255: Evaluated conditional (False): False 24160 1726853559.84264: attempt loop complete, returning result 24160 1726853559.84267: _execute() done 24160 1726853559.84270: dumping result to json 24160 1726853559.84274: done dumping result, returning 24160 1726853559.84283: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [02083763-bbaf-5676-4eb4-0000000006a0] 24160 1726853559.84285: sending task result for task 02083763-bbaf-5676-4eb4-0000000006a0 24160 1726853559.84389: done sending task result for task 02083763-bbaf-5676-4eb4-0000000006a0 24160 1726853559.84392: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.103618", "end": "2024-09-20 13:32:39.799775", "rc": 0, "start": "2024-09-20 13:32:39.696157" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 4733 0 --:--:-- --:--:-- --:--:-- 4765 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 16734 0 --:--:-- --:--:-- --:--:-- 17117 24160 1726853559.84461: no more pending results, returning what we have 24160 1726853559.84465: results queue empty 24160 1726853559.84466: 
checking for any_errors_fatal 24160 1726853559.84475: done checking for any_errors_fatal 24160 1726853559.84476: checking for max_fail_percentage 24160 1726853559.84477: done checking for max_fail_percentage 24160 1726853559.84479: checking to see if all hosts have failed and the running result is not ok 24160 1726853559.84479: done checking to see if all hosts have failed 24160 1726853559.84480: getting the remaining hosts for this loop 24160 1726853559.84481: done getting the remaining hosts for this loop 24160 1726853559.84488: getting the next task for host managed_node1 24160 1726853559.84496: done getting next task for host managed_node1 24160 1726853559.84499: ^ task is: TASK: meta (flush_handlers) 24160 1726853559.84501: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853559.84506: getting variables 24160 1726853559.84507: in VariableManager get_vars() 24160 1726853559.84534: Calling all_inventory to load vars for managed_node1 24160 1726853559.84536: Calling groups_inventory to load vars for managed_node1 24160 1726853559.84539: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853559.84550: Calling all_plugins_play to load vars for managed_node1 24160 1726853559.84552: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853559.84557: Calling groups_plugins_play to load vars for managed_node1 24160 1726853559.88495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853559.89459: done with get_vars() 24160 1726853559.89482: done getting variables 24160 1726853559.89535: in VariableManager get_vars() 24160 1726853559.89544: Calling all_inventory to load vars for managed_node1 24160 1726853559.89547: Calling groups_inventory to load vars for managed_node1 24160 1726853559.89549: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853559.89556: Calling all_plugins_play to load vars for managed_node1 24160 1726853559.89559: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853559.89562: Calling groups_plugins_play to load vars for managed_node1 24160 1726853559.90523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853559.91453: done with get_vars() 24160 1726853559.91470: done queuing things up, now waiting for results queue to drain 24160 1726853559.91473: results queue empty 24160 1726853559.91474: checking for any_errors_fatal 24160 1726853559.91476: done checking for any_errors_fatal 24160 1726853559.91477: checking for max_fail_percentage 24160 1726853559.91478: done checking for max_fail_percentage 24160 1726853559.91478: checking to see if all hosts have failed and the running result is not 
ok 24160 1726853559.91479: done checking to see if all hosts have failed 24160 1726853559.91479: getting the remaining hosts for this loop 24160 1726853559.91480: done getting the remaining hosts for this loop 24160 1726853559.91482: getting the next task for host managed_node1 24160 1726853559.91484: done getting next task for host managed_node1 24160 1726853559.91485: ^ task is: TASK: meta (flush_handlers) 24160 1726853559.91486: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24160 1726853559.91487: getting variables 24160 1726853559.91488: in VariableManager get_vars() 24160 1726853559.91493: Calling all_inventory to load vars for managed_node1 24160 1726853559.91494: Calling groups_inventory to load vars for managed_node1 24160 1726853559.91496: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853559.91499: Calling all_plugins_play to load vars for managed_node1 24160 1726853559.91501: Calling groups_plugins_inventory to load vars for managed_node1 24160 1726853559.91502: Calling groups_plugins_play to load vars for managed_node1 24160 1726853559.92125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853559.93693: done with get_vars() 24160 1726853559.93713: done getting variables 24160 1726853559.93759: in VariableManager get_vars() 24160 1726853559.93767: Calling all_inventory to load vars for managed_node1 24160 1726853559.93769: Calling groups_inventory to load vars for managed_node1 24160 1726853559.93773: Calling all_plugins_inventory to load vars for managed_node1 24160 1726853559.93778: Calling all_plugins_play to load vars for managed_node1 24160 1726853559.93780: Calling groups_plugins_inventory to load vars for 
managed_node1 24160 1726853559.93783: Calling groups_plugins_play to load vars for managed_node1 24160 1726853559.94961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24160 1726853559.96533: done with get_vars() 24160 1726853559.96558: done queuing things up, now waiting for results queue to drain 24160 1726853559.96560: results queue empty 24160 1726853559.96561: checking for any_errors_fatal 24160 1726853559.96562: done checking for any_errors_fatal 24160 1726853559.96563: checking for max_fail_percentage 24160 1726853559.96564: done checking for max_fail_percentage 24160 1726853559.96565: checking to see if all hosts have failed and the running result is not ok 24160 1726853559.96566: done checking to see if all hosts have failed 24160 1726853559.96567: getting the remaining hosts for this loop 24160 1726853559.96567: done getting the remaining hosts for this loop 24160 1726853559.96570: getting the next task for host managed_node1 24160 1726853559.96575: done getting next task for host managed_node1 24160 1726853559.96576: ^ task is: None 24160 1726853559.96577: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24160 1726853559.96578: done queuing things up, now waiting for results queue to drain 24160 1726853559.96579: results queue empty 24160 1726853559.96580: checking for any_errors_fatal 24160 1726853559.96580: done checking for any_errors_fatal 24160 1726853559.96581: checking for max_fail_percentage 24160 1726853559.96582: done checking for max_fail_percentage 24160 1726853559.96583: checking to see if all hosts have failed and the running result is not ok 24160 1726853559.96583: done checking to see if all hosts have failed 24160 1726853559.96584: getting the next task for host managed_node1 24160 1726853559.96586: done getting next task for host managed_node1 24160 1726853559.96587: ^ task is: None 24160 1726853559.96588: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 

PLAY RECAP *********************************************************************
managed_node1              : ok=75   changed=2    unreachable=0    failed=0    skipped=75   rescued=0    ignored=1   

Friday 20 September 2024  13:32:39 -0400 (0:00:00.652)       0:00:36.369 ****** 
=============================================================================== 
fedora.linux_system_roles.network : Check which services are running ---- 1.96s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
Gathering Facts --------------------------------------------------------- 1.64s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6 
fedora.linux_system_roles.network : Check which packages are installed --- 1.15s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
Create veth interface ethtest0 ------------------------------------------ 1.13s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.04s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 
fedora.linux_system_roles.network : Check which packages are installed --- 0.92s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 
fedora.linux_system_roles.network : Check which packages are installed --- 0.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
Gathering Facts --------------------------------------------------------- 0.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.85s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 
Check if system is ostree ----------------------------------------------- 0.80s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 
Install iproute --------------------------------------------------------- 0.76s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 
Gather the minimum subset of ansible_facts required by the network role test --- 0.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.65s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 
Verify DNS and network connectivity ------------------------------------- 0.65s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 
24160 1726853559.96676: RUNNING CLEANUP
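When post-processing a run like the one above (for example in CI), the per-host counters in the PLAY RECAP line can be extracted programmatically. The sketch below is not part of Ansible's API; the regex and helper name are assumptions chosen for illustration, applied to the recap line from this log.

```python
import re

# One "key=value" group repeated, preceded by "host :" — matches the
# hostname and counter fields of a single PLAY RECAP line.
RECAP_RE = re.compile(r"(\S+)\s*:\s*((?:\w+=\d+\s*)+)")


def parse_recap_line(line):
    """Parse one PLAY RECAP host line into (host, {counter: value})."""
    m = RECAP_RE.search(line)
    if m is None:
        return None
    host = m.group(1)
    stats = {k: int(v) for k, v in re.findall(r"(\w+)=(\d+)", m.group(2))}
    return host, stats


# The recap line produced by the run in this log:
host, stats = parse_recap_line(
    "managed_node1 : ok=75 changed=2 unreachable=0 failed=0 "
    "skipped=75 rescued=0 ignored=1"
)
```

A CI gate might then simply check `stats["failed"] == 0 and stats["unreachable"] == 0` before declaring the test run green.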