[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
22286 1726882776.06155: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-4FB
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
22286 1726882776.06756: Added group all to inventory
22286 1726882776.06759: Added group ungrouped to inventory
22286 1726882776.06764: Group all now contains ungrouped
22286 1726882776.06768: Examining possible inventory source: /tmp/network-lQx/inventory.yml
22286 1726882776.30543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
22286 1726882776.30625: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
22286 1726882776.30661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
22286 1726882776.30738: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
22286 1726882776.30839: Loaded config def from plugin (inventory/script)
22286 1726882776.30842: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
22286 1726882776.30898: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
22286 1726882776.31016: Loaded config def from plugin (inventory/yaml)
22286 1726882776.31019: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
22286 1726882776.31133: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
22286 1726882776.31718: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
22286 1726882776.31722: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
22286 1726882776.31726: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
22286 1726882776.31732: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
22286 1726882776.31740: Loading data from /tmp/network-lQx/inventory.yml
22286 1726882776.31839: /tmp/network-lQx/inventory.yml was not parsable by auto
22286 1726882776.31924: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
22286 1726882776.31979: Loading data from /tmp/network-lQx/inventory.yml
22286 1726882776.32093: group all already in inventory
22286 1726882776.32101: set inventory_file for managed_node1
22286 1726882776.32106: set inventory_dir for managed_node1
22286 1726882776.32107: Added host managed_node1 to inventory
22286 1726882776.32110: Added host managed_node1 to group all
22286 1726882776.32111: set ansible_host for managed_node1
22286 1726882776.32112: set ansible_ssh_extra_args for managed_node1
22286 1726882776.32116: set inventory_file for managed_node2
22286 1726882776.32120: set inventory_dir for managed_node2
22286 1726882776.32121: Added host managed_node2 to inventory
22286 1726882776.32123: Added host managed_node2 to group all
22286 1726882776.32124: set ansible_host for managed_node2
22286 1726882776.32125: set ansible_ssh_extra_args for managed_node2
22286 1726882776.32129: set inventory_file for managed_node3
22286 1726882776.32132: set inventory_dir for managed_node3
22286 1726882776.32133: Added host managed_node3 to inventory
22286 1726882776.32136: Added host managed_node3 to group all
22286 1726882776.32137: set ansible_host for managed_node3
22286 1726882776.32139: set ansible_ssh_extra_args for managed_node3
22286 1726882776.32142: Reconcile groups and hosts in inventory.
22286 1726882776.32147: Group ungrouped now contains managed_node1
22286 1726882776.32150: Group ungrouped now contains managed_node2
22286 1726882776.32152: Group ungrouped now contains managed_node3
22286 1726882776.32249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
22286 1726882776.32423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
22286 1726882776.32490: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
22286 1726882776.32529: Loaded config def from plugin (vars/host_group_vars)
22286 1726882776.32532: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
22286 1726882776.32542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
22286 1726882776.32552: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
22286 1726882776.32606: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
22286 1726882776.33005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
22286 1726882776.33122: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
22286 1726882776.33185: Loaded config def from plugin (connection/local)
22286 1726882776.33189: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
22286 1726882776.34118: Loaded config def from plugin (connection/paramiko_ssh)
22286 1726882776.34122: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
22286 1726882776.35341: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
22286 1726882776.35401: Loaded config def from plugin (connection/psrp)
22286 1726882776.35405: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
22286 1726882776.36485: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
22286 1726882776.36542: Loaded config def from plugin (connection/ssh)
22286 1726882776.36550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
22286 1726882776.39105: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
22286 1726882776.39167: Loaded config def from plugin (connection/winrm)
22286 1726882776.39171: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
22286 1726882776.39209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
22286 1726882776.39290: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
22286 1726882776.39395: Loaded config def from plugin (shell/cmd)
22286 1726882776.39398: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
22286 1726882776.39430: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
22286 1726882776.39537: Loaded config def from plugin (shell/powershell)
22286 1726882776.39539: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
22286 1726882776.39607: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
22286 1726882776.39871: Loaded config def from plugin (shell/sh)
22286 1726882776.39874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
22286 1726882776.39915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
22286 1726882776.40102: Loaded config def from plugin (become/runas)
22286 1726882776.40105: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
22286 1726882776.40381: Loaded config def from plugin (become/su)
22286 1726882776.40384: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
22286 1726882776.40625: Loaded config def from plugin (become/sudo)
22286 1726882776.40628: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
22286 1726882776.40673: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
22286 1726882776.41110: in VariableManager get_vars()
22286 1726882776.41142: done with get_vars()
22286 1726882776.41306: trying /usr/local/lib/python3.12/site-packages/ansible/modules
22286 1726882776.44999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
22286 1726882776.45160: in VariableManager get_vars()
22286 1726882776.45166: done with get_vars()
22286 1726882776.45169: variable 'playbook_dir' from source: magic vars
22286 1726882776.45175: variable 'ansible_playbook_python' from source: magic vars
22286 1726882776.45176: variable 'ansible_config_file' from source: magic vars
22286 1726882776.45177: variable 'groups' from source: magic vars
22286 1726882776.45178: variable 'omit' from source: magic vars
22286 1726882776.45179: variable 'ansible_version' from source: magic vars
22286 1726882776.45180: variable 'ansible_check_mode' from source: magic vars
22286 1726882776.45181: variable 'ansible_diff_mode' from source: magic vars
22286 1726882776.45182: variable 'ansible_forks' from source: magic vars
22286 1726882776.45183: variable 'ansible_inventory_sources' from source: magic vars
22286 1726882776.45184: variable 'ansible_skip_tags' from source: magic vars
22286 1726882776.45185: variable 'ansible_limit' from source: magic vars
22286 1726882776.45186: variable 'ansible_run_tags' from source: magic vars
22286 1726882776.45187: variable 'ansible_verbosity' from source: magic vars
22286 1726882776.45232: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml
22286 1726882776.46013: in VariableManager get_vars()
22286 1726882776.46032: done with get_vars()
22286 1726882776.46086: in VariableManager get_vars()
22286 1726882776.46103: done with get_vars()
22286 1726882776.46456: in VariableManager get_vars()
22286 1726882776.46478: done with get_vars()
22286 1726882776.46486: variable 'omit' from source: magic vars
22286 1726882776.46509: variable 'omit' from source: magic vars
22286 1726882776.46555: in VariableManager get_vars()
22286 1726882776.46568: done with get_vars()
22286 1726882776.46632: in VariableManager get_vars()
22286 1726882776.46651: done with get_vars()
22286 1726882776.46700: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
22286 1726882776.47011: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
22286 1726882776.47203: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
22286 1726882776.48194: in VariableManager get_vars()
22286 1726882776.48223: done with get_vars()
22286 1726882776.48772: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
22286 1726882776.48975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
22286 1726882776.51205: in VariableManager get_vars()
22286 1726882776.51226: done with get_vars()
22286 1726882776.51280: in VariableManager get_vars()
22286 1726882776.51315: done with get_vars()
22286 1726882776.52259: in VariableManager get_vars()
22286 1726882776.52280: done with get_vars()
22286 1726882776.52286: variable 'omit' from source: magic vars
22286 1726882776.52299: variable 'omit' from source: magic vars
22286 1726882776.52342: in VariableManager get_vars()
22286 1726882776.52364: done with get_vars()
22286 1726882776.52390: in VariableManager get_vars()
22286 1726882776.52409: done with get_vars()
22286 1726882776.52444: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
22286 1726882776.52612: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
22286 1726882776.52817: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
22286 1726882776.54870: in VariableManager get_vars()
22286 1726882776.54904: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
22286 1726882776.57527: in VariableManager get_vars()
22286 1726882776.57554: done with get_vars()
22286 1726882776.57716: in VariableManager get_vars()
22286 1726882776.57746: done with get_vars()
22286 1726882776.57811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
22286 1726882776.57829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
22286 1726882776.58126: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
22286 1726882776.58365: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
22286 1726882776.58368: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
22286 1726882776.58412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
22286 1726882776.58446: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
22286 1726882776.58694: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
22286 1726882776.58785: Loaded config def from plugin (callback/default)
22286 1726882776.58788: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
22286 1726882776.60312: Loaded config def from plugin (callback/junit)
22286 1726882776.60315: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
22286 1726882776.60376: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
22286 1726882776.60473: Loaded config def from plugin (callback/minimal)
22286 1726882776.60476: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
22286 1726882776.60524: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
22286 1726882776.60608: Loaded config def from plugin (callback/tree)
22286 1726882776.60611: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
22286 1726882776.60769: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
22286 1726882776.60772: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_nm.yml ****************************************************
2 plays in /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
22286 1726882776.60807: in VariableManager get_vars()
22286 1726882776.60822: done with get_vars()
22286 1726882776.60828: in VariableManager get_vars()
22286 1726882776.60840: done with get_vars()
22286 1726882776.60845: variable 'omit' from source: magic vars
22286 1726882776.60891: in VariableManager get_vars()
22286 1726882776.60912: done with get_vars()
22286 1726882776.60939: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] *************
22286 1726882776.61605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
22286 1726882776.61697: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
22286 1726882776.61853: getting the remaining hosts for this loop
22286 1726882776.61855: done getting the remaining hosts for this loop
22286 1726882776.61858: getting the next task for host managed_node3
22286 1726882776.61862: done getting next task for host managed_node3
22286 1726882776.61864: ^ task is: TASK: Gathering Facts
22286 1726882776.61866: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
22286 1726882776.61869: getting variables
22286 1726882776.61870: in VariableManager get_vars()
22286 1726882776.61940: Calling all_inventory to load vars for managed_node3
22286 1726882776.61943: Calling groups_inventory to load vars for managed_node3
22286 1726882776.61947: Calling all_plugins_inventory to load vars for managed_node3
22286 1726882776.61961: Calling all_plugins_play to load vars for managed_node3
22286 1726882776.61974: Calling groups_plugins_inventory to load vars for managed_node3
22286 1726882776.61979: Calling groups_plugins_play to load vars for managed_node3
22286 1726882776.62038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
22286 1726882776.62106: done with get_vars()
22286 1726882776.62114: done getting variables
22286 1726882776.62189: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Friday 20 September 2024 21:39:36 -0400 (0:00:00.015) 0:00:00.015 ******
22286 1726882776.62217: entering _queue_task() for managed_node3/gather_facts
22286 1726882776.62219: Creating lock for gather_facts
22286 1726882776.62608: worker is 1 (out of 1 available)
22286 1726882776.62617: exiting _queue_task() for managed_node3/gather_facts
22286 1726882776.62637: done queuing things up, now waiting for results queue to drain
22286 1726882776.62639: waiting for pending results...
22286 1726882776.62903: running TaskExecutor() for managed_node3/TASK: Gathering Facts
22286 1726882776.63021: in run() - task 0affe814-3a2d-a75d-4836-0000000000b9
22286 1726882776.63047: variable 'ansible_search_path' from source: unknown
22286 1726882776.63098: calling self._execute()
22286 1726882776.63174: variable 'ansible_host' from source: host vars for 'managed_node3'
22286 1726882776.63188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
22286 1726882776.63214: variable 'omit' from source: magic vars
22286 1726882776.63345: variable 'omit' from source: magic vars
22286 1726882776.63382: variable 'omit' from source: magic vars
22286 1726882776.63438: variable 'omit' from source: magic vars
22286 1726882776.63526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
22286 1726882776.63545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
22286 1726882776.63572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
22286 1726882776.63598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
22286 1726882776.63616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
22286 1726882776.63662: variable 'inventory_hostname' from source: host vars for 'managed_node3'
22286 1726882776.63742: variable 'ansible_host' from source: host vars for 'managed_node3'
22286 1726882776.63745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
22286 1726882776.63830: Set connection var ansible_shell_executable to /bin/sh
22286 1726882776.63853: Set connection var ansible_module_compression to ZIP_DEFLATED
22286 1726882776.63865: Set connection var ansible_connection to ssh
22286 1726882776.63872: Set connection var ansible_shell_type to sh
22286 1726882776.63884: Set connection var ansible_timeout to 10
22286 1726882776.63899: Set connection var ansible_pipelining to False
22286 1726882776.63928: variable 'ansible_shell_executable' from source: unknown
22286 1726882776.63939: variable 'ansible_connection' from source: unknown
22286 1726882776.63948: variable 'ansible_module_compression' from source: unknown
22286 1726882776.63969: variable 'ansible_shell_type' from source: unknown
22286 1726882776.63973: variable 'ansible_shell_executable' from source: unknown
22286 1726882776.64039: variable 'ansible_host' from source: host vars for 'managed_node3'
22286 1726882776.64043: variable 'ansible_pipelining' from source: unknown
22286 1726882776.64046: variable 'ansible_timeout' from source: unknown
22286 1726882776.64048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
22286 1726882776.64267: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
22286 1726882776.64290: variable 'omit' from source: magic vars
22286 1726882776.64305: starting attempt loop
22286 1726882776.64441: running the handler
22286 1726882776.64444: variable 'ansible_facts' from source: unknown
22286 1726882776.64447: _low_level_execute_command(): starting
22286 1726882776.64449: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
22286 1726882776.65243: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
22286 1726882776.65273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
22286 1726882776.65293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
22286 1726882776.65318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
22286 1726882776.65481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
22286 1726882776.67363: stdout chunk (state=3): >>>/root <<<
22286 1726882776.67567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
22286 1726882776.67570: stdout chunk (state=3): >>><<<
22286 1726882776.67573: stderr chunk (state=3): >>><<<
22286 1726882776.67601: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
22286 1726882776.67621: _low_level_execute_command(): starting
22286 1726882776.67632: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923 `" && echo ansible-tmp-1726882776.6760824-22308-113012573746923="` echo /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923 `" ) && sleep 0'
22286 1726882776.68305: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
22286 1726882776.68321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
22286 1726882776.68337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
22286 1726882776.68374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
22286 1726882776.68481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
22286 1726882776.68537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
22286 1726882776.68662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
22286 1726882776.70764: stdout chunk (state=3): >>>ansible-tmp-1726882776.6760824-22308-113012573746923=/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923 <<<
22286 1726882776.70975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
22286 1726882776.70979: stdout chunk (state=3): >>><<<
22286 1726882776.70982: stderr chunk (state=3): >>><<<
22286 1726882776.71001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882776.6760824-22308-113012573746923=/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
22286 1726882776.71049: variable 'ansible_module_compression' from source: unknown
22286 1726882776.71146: ANSIBALLZ: Using generic lock for ansible.legacy.setup
22286 1726882776.71149: ANSIBALLZ: Acquiring lock
22286 1726882776.71152: ANSIBALLZ: Lock acquired: 140212085117232
22286 1726882776.71154: ANSIBALLZ: Creating module
22286 1726882777.21099: ANSIBALLZ: Writing module into payload
22286 1726882777.21310: ANSIBALLZ: Writing module
22286 1726882777.21357: ANSIBALLZ: Renaming module
22286 1726882777.21370: ANSIBALLZ: Done creating module
22286 1726882777.21409: variable 'ansible_facts' from source: unknown
22286 1726882777.21422: variable 'inventory_hostname' from source: host vars for 'managed_node3'
22286 1726882777.21438: _low_level_execute_command(): starting
22286 1726882777.21454: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
22286 1726882777.22102: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
22286 1726882777.22117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
22286 1726882777.22133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
22286 1726882777.22153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
22286 1726882777.22170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<<
22286 1726882777.22196: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
22286 1726882777.22303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
22286 1726882777.22317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
22286 1726882777.22474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
22286 1726882777.24729: stdout chunk (state=3): >>>PLATFORM <<<
22286 1726882777.24733: stdout chunk (state=3): >>>Linux FOUND <<<
22286 1726882777.24739: stdout chunk (state=3): >>>/usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
22286 1726882777.24928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
22286 1726882777.24931: stdout chunk (state=3): >>><<<
22286 1726882777.24942: stderr chunk (state=3): >>><<<
22286 1726882777.24964: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests
final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22286 1726882777.24973 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 22286 1726882777.25029: _low_level_execute_command(): starting 22286 1726882777.25032: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 22286 1726882777.25396: Sending initial data 22286 1726882777.25400: Sent initial data (1181 bytes) 22286 1726882777.26403: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882777.26450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882777.26519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882777.26626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882777.26686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882777.26847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882777.30683: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 22286 1726882777.31339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882777.31343: stdout chunk (state=3): >>><<< 22286 1726882777.31345: stderr chunk (state=3): >>><<< 22286 1726882777.31348: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882777.31350: variable 'ansible_facts' from source: unknown 22286 1726882777.31352: variable 'ansible_facts' from source: unknown 22286 1726882777.31354: variable 'ansible_module_compression' from source: unknown 22286 1726882777.31356: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22286 1726882777.31358: variable 'ansible_facts' from source: unknown 22286 1726882777.31931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py 22286 1726882777.32254: Sending initial data 22286 1726882777.32265: Sent initial data (154 bytes) 22286 1726882777.33559: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882777.33667: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882777.33844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882777.33959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882777.34105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882777.35823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882777.35913: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882777.35942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882777.36052: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpawvlh2y0 /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py <<< 22286 1726882777.36057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py" <<< 22286 1726882777.36183: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpawvlh2y0" to remote "/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py" <<< 22286 1726882777.40713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882777.40895: stderr chunk (state=3): >>><<< 22286 1726882777.40905: stdout chunk (state=3): >>><<< 22286 1726882777.40996: done transferring module to remote 22286 1726882777.41017: _low_level_execute_command(): starting 22286 1726882777.41049: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/ /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py && sleep 0' 22286 1726882777.42356: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882777.42408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882777.42444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882777.42552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882777.42581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882777.42687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882777.44697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882777.44783: stderr chunk (state=3): >>><<< 22286 1726882777.44962: stdout chunk (state=3): >>><<< 22286 1726882777.44974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882777.44978: _low_level_execute_command(): starting 22286 1726882777.44980: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/AnsiballZ_setup.py && sleep 0' 22286 1726882777.45758: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882777.45782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882777.45799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882777.45818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882777.45839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882777.45856: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882777.45956: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882777.45959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882777.46011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882777.46114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882777.46273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882777.49884: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 22286 1726882777.49888: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 22286 1726882777.49891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 22286 1726882777.49936: stdout chunk (state=3): >>>import 
'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196b2c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196afbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 22286 1726882777.49959: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196b2eab0> <<< 22286 1726882777.50163: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 22286 1726882777.50172: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22286 1726882777.50194: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 22286 1726882777.50220: stdout chunk (state=3): >>>import 'os' # <<< 22286 1726882777.50285: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 22286 1726882777.50301: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 22286 1726882777.50319: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 22286 1726882777.50451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11968dd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11968ddfd0> <<< 22286 1726882777.50467: stdout chunk (state=3): >>>import 'site' # <<< 22286 1726882777.50499: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 22286 1726882777.50910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 22286 1726882777.50930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22286 1726882777.51010: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 22286 1726882777.51016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882777.51020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22286 1726882777.51037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22286 1726882777.51056: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22286 1726882777.51083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 22286 1726882777.51391: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119691bda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f119691bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 22286 1726882777.51423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969537d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196953e60> <<< 22286 1726882777.51458: stdout chunk (state=3): >>>import '_collections' # <<< 22286 1726882777.51555: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196933a70> <<< 22286 1726882777.51583: stdout chunk (state=3): >>>import '_functools' # <<< 22286 1726882777.51741: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196931190> <<< 22286 1726882777.52020: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196918f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22286 1726882777.52027: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 22286 1726882777.52050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22286 1726882777.52104: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969776b0> <<< 22286 1726882777.52131: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969762d0> <<< 22286 1726882777.52168: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 22286 1726882777.52373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196932030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119691ae40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a8770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969181d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 22286 
1726882777.52377: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969a8c20> <<< 22286 1726882777.52392: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a8ad0> <<< 22286 1726882777.52438: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.52473: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.52485: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969a8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196916d20> <<< 22286 1726882777.52547: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 22286 1726882777.52566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882777.52674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 22286 1726882777.52707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 22286 1726882777.52752: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a9520> <<< 22286 1726882777.52757: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a91f0> <<< 22286 1726882777.52785: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 22286 1726882777.52814: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py <<< 22286 1726882777.52945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969aa420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22286 1726882777.52969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22286 1726882777.53015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 22286 1726882777.53037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 22286 1726882777.53064: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c4620> <<< 22286 1726882777.53067: stdout chunk (state=3): >>>import 'errno' # <<< 22286 1726882777.53142: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.53146: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.53156: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969c5d60> <<< 22286 1726882777.53176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 22286 1726882777.53217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22286 1726882777.53450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object 
from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c6c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969c72c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c61b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969c7d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c7470> <<< 22286 1726882777.53666: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969aa390> <<< 22286 1726882777.53705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22286 1726882777.54004: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966cfcb0> <<< 22286 1726882777.54008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966f8770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966f84d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966f87a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966f8980> <<< 22286 1726882777.54243: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966cde50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966fa000> <<< 22286 
1726882777.54266: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966f8c80> <<< 22286 1726882777.54304: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969aab40> <<< 22286 1726882777.54344: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22286 1726882777.54426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882777.54467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22286 1726882777.54539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22286 1726882777.54588: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119672a360> <<< 22286 1726882777.54660: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 22286 1726882777.54697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882777.54724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22286 1726882777.54764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22286 1726882777.54857: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196742510> <<< 22286 1726882777.54893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22286 
1726882777.54952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22286 1726882777.55106: stdout chunk (state=3): >>>import 'ntpath' # <<< 22286 1726882777.55120: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119677b2f0> <<< 22286 1726882777.55161: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22286 1726882777.55319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 22286 1726882777.55337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22286 1726882777.55749: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11967a1a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119677b410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11967431a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965c03e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196741550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966faf00> <<< 22286 1726882777.55989: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22286 1726882777.56023: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f11965c05c0> <<< 22286 1726882777.56349: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_n635dq3f/ansible_ansible.legacy.setup_payload.zip' <<< 22286 1726882777.56408: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.56649: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.56754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 22286 1726882777.56757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22286 1726882777.56833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22286 1726882777.56977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196626000> import '_typing' # <<< 22286 1726882777.57308: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965fcef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965c3f80> # zipimport: zlib available <<< 22286 1726882777.57331: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 22286 1726882777.57385: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 22286 1726882777.57516: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 22286 1726882777.59049: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.61191: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 22286 1726882777.61245: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965ffe60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 22286 1726882777.61296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 22286 1726882777.61402: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22286 1726882777.61418: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.61528: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966559a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196655730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196655040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22286 1726882777.61610: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196655790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196626a20> import 'atexit' # <<< 22286 1726882777.61753: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966566c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1196656840> <<< 22286 1726882777.61784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 22286 1726882777.61804: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196656d50> import 'pwd' # <<< 22286 1726882777.61839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22286 1726882777.61869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 22286 1726882777.61983: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964bcad0> <<< 22286 1726882777.62041: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11964be780> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 22286 1726882777.62125: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964bf140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964bff80> <<< 22286 1726882777.62164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22286 1726882777.62283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 22286 1726882777.62286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22286 1726882777.63458: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c2d80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11964c2ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f11964c1040> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c6c60> import '_tokenize' # <<< 22286 1726882777.64123: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c5730> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c5490> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c7ce0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c1550> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119650ade0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650af60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119650cb00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650c8c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119650f080> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650d1c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import 
'_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119651a8a0> <<< 22286 1726882777.64209: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650f230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651b710> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651b8f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651b290> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650b1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # 
extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651eb10> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651fec0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119651d2b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651e180> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119651ce90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22286 1726882777.64341: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.64395: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 22286 1726882777.64546: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 22286 1726882777.64656: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.64777: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.65540: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.66171: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 
'ansible.module_utils.six.moves' # <<< 22286 1726882777.66197: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 22286 1726882777.66439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963a8140> <<< 22286 1726882777.66443: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963a9670> <<< 22286 1726882777.66541: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196522990> <<< 22286 1726882777.66558: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 22286 1726882777.66747: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.66764: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.66996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963a9700> # zipimport: zlib available <<< 
22286 1726882777.67540: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.68279: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.68572: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # <<< 22286 1726882777.68585: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.68651: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.68699: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 22286 1726882777.68724: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.69023: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.69088: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22286 1726882777.69092: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.69172: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 22286 1726882777.69258: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 22286 1726882777.69277: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.70040: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.70248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22286 1726882777.70339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22286 1726882777.70367: stdout chunk (state=3): >>>import '_ast' # <<< 22286 1726882777.70506: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963ab5f0> <<< 22286 1726882777.70624: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.70898: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22286 1726882777.70902: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.71116: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963b1c70> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963b25d0> <<< 22286 1726882777.71123: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963aa5d0> <<< 22286 1726882777.71178: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.71181: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.71394: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 22286 1726882777.71398: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.71462: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22286 1726882777.71514: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882777.71601: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963b1220> <<< 22286 1726882777.71649: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963b2810> <<< 22286 1726882777.71703: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 22286 1726882777.71707: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.71763: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.71845: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.71924: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.71986: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22286 1726882777.71991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22286 1726882777.72062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' 
<<< 22286 1726882777.72171: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22286 1726882777.72173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22286 1726882777.72176: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196446a50> <<< 22286 1726882777.72199: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963bc7a0> <<< 22286 1726882777.72311: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963b7b90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963b6660> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 22286 1726882777.72315: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72345: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72428: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22286 1726882777.72506: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.72510: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 22286 1726882777.72581: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72584: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72671: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.72716: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72728: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72800: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72872: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.72891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 22286 1726882777.72981: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.73283: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 22286 1726882777.73512: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.73927: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882777.74010: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 22286 1726882777.74055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 22286 1726882777.74100: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644d130> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22286 1726882777.74149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 
22286 1726882777.74353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22286 1726882777.74368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594c0e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119594c440> <<< 22286 1726882777.74393: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963c1160> <<< 22286 1726882777.74416: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963c0740> <<< 22286 1726882777.74475: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644eed0> <<< 22286 1726882777.74488: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644ec00> <<< 22286 1726882777.74588: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 22286 1726882777.74609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 22286 1726882777.74624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 22286 
1726882777.74642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 22286 1726882777.74737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119594f3e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594ecc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119594ee70> <<< 22286 1726882777.74759: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594e120> <<< 22286 1726882777.74848: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 22286 1726882777.74942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 22286 1726882777.74970: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594f530><<< 22286 1726882777.75005: stdout chunk (state=3): >>> <<< 22286 1726882777.75050: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22286 1726882777.75123: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 22286 1726882777.75244: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 22286 1726882777.75264: stdout chunk (state=3): >>> import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11959ba060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644edb0> <<< 22286 1726882777.75285: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 22286 1726882777.75324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available<<< 22286 1726882777.75428: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882777.75456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 22286 1726882777.75500: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882777.75600: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other.facter' # <<< 22286 1726882777.75638: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882777.75658: stdout chunk (state=3): >>> <<< 22286 1726882777.75732: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.75925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 22286 1726882777.75929: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available<<< 22286 1726882777.75944: stdout chunk (state=3): >>> <<< 22286 1726882777.75973: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882777.76029: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.apparmor' # <<< 22286 1726882777.76060: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.76371: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.76374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 22286 1726882777.76461: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.76556: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.76648: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.76731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 22286 1726882777.76814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 22286 1726882777.77697: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.78513: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 22286 1726882777.78552: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.78607: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.78906: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 22286 1726882777.78944: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 22286 1726882777.78994: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.79098: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.dns' # <<< 22286 1726882777.79111: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.79133: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.79190: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22286 1726882777.79294: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.79324: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 22286 1726882777.79443: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.79547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 22286 1726882777.79564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 22286 1726882777.79618: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11959bbd70> <<< 22286 1726882777.79643: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22286 1726882777.79665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 22286 1726882777.79877: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11959bae10> import 'ansible.module_utils.facts.system.local' # <<< 22286 1726882777.79892: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.79987: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.80087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 22286 1726882777.80110: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.80249: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22286 1726882777.80408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 22286 1726882777.80513: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.80747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.80771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22286 1726882777.80840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 22286 1726882777.80933: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.81036: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.81057: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11959ee390> <<< 22286 1726882777.81383: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11959dc6e0> <<< 22286 1726882777.81409: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 22286 1726882777.81500: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.81591: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 22286 1726882777.81605: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.81739: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.81877: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.82175: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.82363: stdout chunk 
(state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 22286 1726882777.82400: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.82468: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 22286 1726882777.82544: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.82702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 22286 1726882777.82705: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882777.82847: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1195805ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1195805e50> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 22286 1726882777.83127: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.83397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22286 1726882777.83401: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.83564: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.83744: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.83794: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 22286 1726882777.83869: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 22286 1726882777.83899: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.83912: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.83950: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.84197: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.84450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 22286 1726882777.84464: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.84672: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.84903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 22286 1726882777.84906: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.85051: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.86066: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.86986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 22286 1726882777.87176: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.87347: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 22286 1726882777.87412: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.87515: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.87844: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 22286 1726882777.88039: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22286 1726882777.88240: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 22286 1726882777.88278: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.88295: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 22286 1726882777.88368: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.88432: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 22286 1726882777.88454: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.88769: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.88772: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89123: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 22286 1726882777.89569: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 22286 1726882777.89630: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89657: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 22286 1726882777.89825: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 22286 1726882777.89977: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.89981: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.90047: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 22286 1726882777.90108: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.90195: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 22286 1726882777.90240: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.90307: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.90382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 22286 1726882777.90410: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.91120: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.91430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 22286 1726882777.91644: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.91648: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 22286 1726882777.91826: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.91866: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 22286 1726882777.91897: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92029: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 22286 1726882777.92454: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882777.92456: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92518: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92638: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 22286 1726882777.92775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22286 1726882777.92784: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92858: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.92946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22286 1726882777.92948: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.93313: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.93666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 22286 1726882777.93684: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.93752: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.93824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22286 1726882777.93838: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.93908: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.93978: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 22286 1726882777.94000: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.94129: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.94284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 
'ansible.module_utils.facts.default_collectors' # <<< 22286 1726882777.94583: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 22286 1726882777.94858: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882777.95314: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 22286 1726882777.95364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 22286 1726882777.95438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 22286 1726882777.95457: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11958328a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1195832f30> <<< 22286 1726882777.95529: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119582ffb0> <<< 22286 1726882778.13278: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 22286 1726882778.13283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 22286 1726882778.13309: stdout chunk (state=3): >>>import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119587af90> <<< 22286 1726882778.13322: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 22286 1726882778.13327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 22286 1726882778.13349: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11958796a0> <<< 22286 1726882778.13394: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 22286 1726882778.13405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882778.13462: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 22286 1726882778.13468: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119587b8c0> <<< 22286 1726882778.13490: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119587a930> <<< 22286 1726882778.13744: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 22286 1726882778.40459: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_local": {}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SEL<<< 22286 1726882778.40519: stdout chunk (state=3): >>>INUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2877, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 840, "free": 2877}, "nocache": {"free": 3464, "used": 253}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 922, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251211694080, "block_size": 4096, "block_total": 64483404, "block_available": 61330980, "block_used": 3152424, 
"inode_total": 16384000, "inode_available": 16303841, "inode_used": 80159, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.61767578125, "5m": 0.4580078125, "15m": 0.27294921875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "38", "epoch": "1726882778", "epoch_int": "1726882778", "date": "2024-09-20", "time": "21:39:38", "iso8601_micro": "2024-09-21T01:39:38.341553Z", "<<< 22286 1726882778.40533: stdout chunk (state=3): >>>iso8601": "2024-09-21T01:39:38Z", "iso8601_basic": "20240920T213938341553", "iso8601_basic_short": "20240920T213938", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixe<<< 22286 1726882778.40538: stdout chunk (state=3): >>>d]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22286 1726882778.41252: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 22286 1726882778.41304: stdout chunk (state=3): >>># clear sys.path_hooks <<< 22286 1726882778.41310: stdout chunk (state=3): >>># clear builtins._ <<< 22286 1726882778.41313: stdout chunk (state=3): >>># 
clear sys.path # clear sys.argv<<< 22286 1726882778.41319: stdout chunk (state=3): >>> # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 22286 1726882778.41352: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp<<< 22286 1726882778.41380: stdout chunk (state=3): >>> # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal<<< 22286 1726882778.41408: stdout chunk (state=3): >>> # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath<<< 22286 1726882778.41427: stdout chunk (state=3): >>> # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site<<< 22286 1726882778.41449: stdout chunk (state=3): >>> # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 22286 1726882778.41458: stdout chunk (state=3): >>> # destroy reprlib # 
cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools<<< 22286 1726882778.41480: stdout chunk (state=3): >>> # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg<<< 22286 1726882778.41494: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 22286 1726882778.41524: stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2<<< 22286 1726882778.41547: stdout chunk (state=3): >>> # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random<<< 22286 1726882778.41557: stdout chunk (state=3): >>> # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset<<< 22286 1726882778.41584: stdout chunk (state=3): >>> # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib<<< 22286 1726882778.41593: stdout chunk (state=3): >>> # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy 
urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path<<< 22286 1726882778.41614: stdout chunk (state=3): >>> # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil<<< 22286 1726882778.41632: stdout chunk (state=3): >>> # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__<<< 22286 1726882778.41652: stdout chunk (state=3): >>> # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit<<< 22286 1726882778.41665: stdout chunk (state=3): >>> # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 22286 1726882778.41686: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize <<< 22286 1726882778.41768: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux <<< 22286 1726882778.41784: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file<<< 22286 1726882778.41790: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.process <<< 22286 1726882778.41814: stdout chunk (state=3): >>># destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse <<< 22286 1726882778.41825: stdout chunk (state=3): >>># cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic<<< 22286 1726882778.41845: stdout chunk (state=3): >>> # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle<<< 22286 1726882778.41869: stdout chunk (state=3): >>> # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # 
destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq<<< 22286 1726882778.41886: stdout chunk (state=3): >>> # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool<<< 22286 1726882778.41893: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system <<< 22286 1726882778.41919: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime<<< 22286 1726882778.41935: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local<<< 22286 1726882778.41958: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr<<< 22286 1726882778.41978: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl<<< 22286 1726882778.41996: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd<<< 22286 1726882778.42021: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly<<< 22286 1726882778.42040: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme<<< 22286 1726882778.42063: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl<<< 22286 1726882778.42076: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos<<< 22286 1726882778.42096: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other<<< 22286 1726882778.42102: stdout chunk (state=3): >>> # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline<<< 22286 1726882778.42127: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr <<< 22286 1726882778.42144: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base<<< 22286 1726882778.42169: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd<<< 22286 1726882778.42198: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn<<< 22286 1726882778.42208: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd <<< 22286 1726882778.42224: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly<<< 22286 1726882778.42248: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata<<< 22286 1726882778.42438: stdout chunk (state=3): >>> # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 22286 1726882778.42776: stdout chunk (state=3): >>># 
destroy _sitebuiltins <<< 22286 1726882778.42806: stdout chunk (state=3): >>># destroy importlib.machinery <<< 22286 1726882778.42821: stdout chunk (state=3): >>># destroy importlib._abc <<< 22286 1726882778.42826: stdout chunk (state=3): >>># destroy importlib.util <<< 22286 1726882778.42873: stdout chunk (state=3): >>># destroy _bz2 <<< 22286 1726882778.42890: stdout chunk (state=3): >>># destroy _compression <<< 22286 1726882778.42909: stdout chunk (state=3): >>># destroy _lzma <<< 22286 1726882778.42935: stdout chunk (state=3): >>># destroy binascii <<< 22286 1726882778.42952: stdout chunk (state=3): >>># destroy zlib <<< 22286 1726882778.42968: stdout chunk (state=3): >>># destroy bz2 # destroy lzma <<< 22286 1726882778.42978: stdout chunk (state=3): >>># destroy zipfile._path <<< 22286 1726882778.43008: stdout chunk (state=3): >>># destroy zipfile <<< 22286 1726882778.43023: stdout chunk (state=3): >>># destroy pathlib<<< 22286 1726882778.43042: stdout chunk (state=3): >>> # destroy zipfile._path.glob <<< 22286 1726882778.43046: stdout chunk (state=3): >>># destroy ipaddress <<< 22286 1726882778.43098: stdout chunk (state=3): >>># destroy ntpath <<< 22286 1726882778.43124: stdout chunk (state=3): >>># destroy importlib <<< 22286 1726882778.43145: stdout chunk (state=3): >>># destroy zipimport # destroy __main__<<< 22286 1726882778.43240: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 22286 1726882778.43282: stdout chunk (state=3): >>># destroy _hashlib <<< 22286 1726882778.43295: stdout chunk (state=3): >>># destroy _blake2 <<< 22286 1726882778.43312: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 22286 1726882778.43353: stdout chunk (state=3): 
>>># destroy distro<<< 22286 1726882778.43358: stdout chunk (state=3): >>> # destroy distro.distro<<< 22286 1726882778.43369: stdout chunk (state=3): >>> # destroy argparse<<< 22286 1726882778.43384: stdout chunk (state=3): >>> # destroy logging <<< 22286 1726882778.43443: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors<<< 22286 1726882778.43451: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues<<< 22286 1726882778.43470: stdout chunk (state=3): >>> # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 22286 1726882778.43500: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle<<< 22286 1726882778.43510: stdout chunk (state=3): >>> <<< 22286 1726882778.43526: stdout chunk (state=3): >>># destroy queue # destroy _heapq<<< 22286 1726882778.43547: stdout chunk (state=3): >>> # destroy _queue <<< 22286 1726882778.43550: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 22286 1726882778.43599: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime<<< 22286 1726882778.43602: stdout chunk (state=3): >>> # destroy subprocess<<< 22286 1726882778.43621: stdout chunk (state=3): >>> # destroy base64<<< 22286 1726882778.43662: stdout chunk (state=3): >>> # destroy _ssl <<< 22286 1726882778.43698: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 22286 1726882778.43708: stdout chunk (state=3): >>> # destroy getpass<<< 22286 1726882778.43715: stdout chunk (state=3): >>> # destroy pwd<<< 22286 1726882778.43742: stdout chunk (state=3): >>> # destroy termios <<< 22286 1726882778.43749: stdout chunk (state=3): >>># destroy json<<< 22286 1726882778.43786: stdout chunk (state=3): >>> # destroy socket<<< 22286 1726882778.43795: stdout chunk (state=3): >>> # destroy struct <<< 
22286 1726882778.43810: stdout chunk (state=3): >>># destroy glob<<< 22286 1726882778.43828: stdout chunk (state=3): >>> # destroy fnmatch<<< 22286 1726882778.43836: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector<<< 22286 1726882778.43860: stdout chunk (state=3): >>> # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process<<< 22286 1726882778.43886: stdout chunk (state=3): >>> # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 22286 1726882778.43953: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna<<< 22286 1726882778.43979: stdout chunk (state=3): >>> # destroy stringprep<<< 22286 1726882778.43983: stdout chunk (state=3): >>> # cleanup[3] wiping configparser<<< 22286 1726882778.44002: stdout chunk (state=3): >>> # cleanup[3] wiping selinux._selinux<<< 22286 1726882778.44018: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian <<< 22286 1726882778.44033: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes <<< 22286 1726882778.44046: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser<<< 22286 1726882778.44065: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 22286 1726882778.44074: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 22286 1726882778.44090: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 22286 1726882778.44106: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping 
tokenize<<< 22286 1726882778.44118: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 22286 1726882778.44138: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 22286 1726882778.44157: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading<<< 22286 1726882778.44171: stdout chunk (state=3): >>> # cleanup[3] wiping weakref<<< 22286 1726882778.44182: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random<<< 22286 1726882778.44199: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings<<< 22286 1726882778.44214: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap_external<<< 22286 1726882778.44237: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 22286 1726882778.44241: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix<<< 22286 1726882778.44258: stdout chunk (state=3): >>> # destroy re._compiler # destroy enum<<< 22286 1726882778.44263: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg<<< 22286 1726882778.44276: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 22286 1726882778.44283: stdout chunk (state=3): >>> # cleanup[3] wiping functools<<< 22286 1726882778.44303: stdout chunk (state=3): >>> # cleanup[3] wiping _functools<<< 22286 1726882778.44306: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc<<< 22286 1726882778.44332: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools<<< 22286 1726882778.44340: stdout chunk (state=3): >>> # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 22286 1726882778.44360: stdout chunk (state=3): >>> # 
cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 22286 1726882778.44364: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath<<< 22286 1726882778.44386: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc<<< 22286 1726882778.44409: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 22286 1726882778.44421: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external<<< 22286 1726882778.44453: stdout chunk (state=3): >>> # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 22286 1726882778.44467: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 22286 1726882778.44486: stdout chunk (state=3): >>> # cleanup[3] wiping builtins <<< 22286 1726882778.44507: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader <<< 22286 1726882778.44536: stdout chunk (state=3): >>># destroy systemd._journal # destroy _datetime <<< 22286 1726882778.44747: stdout chunk (state=3): >>># destroy sys.monitoring<<< 22286 1726882778.44766: stdout chunk (state=3): >>> # destroy _socket <<< 22286 1726882778.44821: stdout chunk (state=3): >>># destroy _collections # destroy platform<<< 22286 1726882778.44828: stdout chunk (state=3): >>> <<< 22286 1726882778.44846: stdout chunk (state=3): >>># destroy _uuid<<< 22286 1726882778.44853: stdout chunk (state=3): >>> # destroy stat # destroy genericpath<<< 22286 1726882778.44874: stdout chunk (state=3): >>> # destroy re._parser # destroy tokenize<<< 22286 1726882778.44911: stdout chunk 
(state=3): >>> # destroy ansible.module_utils.six.moves.urllib<<< 22286 1726882778.44929: stdout chunk (state=3): >>> # destroy copyreg<<< 22286 1726882778.44939: stdout chunk (state=3): >>> # destroy contextlib <<< 22286 1726882778.44988: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize<<< 22286 1726882778.45002: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse<<< 22286 1726882778.45012: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response <<< 22286 1726882778.45045: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 22286 1726882778.45058: stdout chunk (state=3): >>># destroy _frozen_importlib_external <<< 22286 1726882778.45065: stdout chunk (state=3): >>># destroy _imp # destroy _io <<< 22286 1726882778.45104: stdout chunk (state=3): >>># destroy marshal # clear sys.meta_path<<< 22286 1726882778.45124: stdout chunk (state=3): >>> # clear sys.modules<<< 22286 1726882778.45127: stdout chunk (state=3): >>> # destroy _frozen_importlib<<< 22286 1726882778.45245: stdout chunk (state=3): >>> <<< 22286 1726882778.45272: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases<<< 22286 1726882778.45278: stdout chunk (state=3): >>> # destroy encodings.utf_8 <<< 22286 1726882778.45314: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 22286 1726882778.45320: stdout chunk (state=3): >>># destroy encodings.idna # destroy _codecs <<< 22286 1726882778.45369: stdout chunk (state=3): >>># destroy io # destroy traceback<<< 22286 1726882778.45377: stdout chunk (state=3): >>> <<< 22286 1726882778.45381: stdout chunk (state=3): >>># destroy warnings # destroy weakref <<< 22286 1726882778.45403: stdout chunk (state=3): >>># 
destroy collections # destroy threading <<< 22286 1726882778.45417: stdout chunk (state=3): >>># destroy atexit # destroy _warnings<<< 22286 1726882778.45436: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 22286 1726882778.45480: stdout chunk (state=3): >>> # destroy _random <<< 22286 1726882778.45488: stdout chunk (state=3): >>># destroy _weakref <<< 22286 1726882778.45523: stdout chunk (state=3): >>># destroy _operator<<< 22286 1726882778.45544: stdout chunk (state=3): >>> # destroy _sha2 # destroy _sre # destroy _string<<< 22286 1726882778.45571: stdout chunk (state=3): >>> # destroy re # destroy itertools<<< 22286 1726882778.45580: stdout chunk (state=3): >>> <<< 22286 1726882778.45600: stdout chunk (state=3): >>># destroy _abc # destroy posix<<< 22286 1726882778.45608: stdout chunk (state=3): >>> # destroy _functools # destroy builtins # destroy _thread<<< 22286 1726882778.45635: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 22286 1726882778.45643: stdout chunk (state=3): >>> <<< 22286 1726882778.46284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882778.46341: stderr chunk (state=3): >>><<< 22286 1726882778.46345: stdout chunk (state=3): >>><<< 22286 1726882778.46465: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196b2c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196afbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196b2eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11968dd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11968ddfd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119691bda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119691bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969537d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196953e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196933a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196931190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196918f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969776b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969762d0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196932030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119691ae40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a8770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969181d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969a8c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a8ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969a8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196916d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969a91f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969aa420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c4620> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969c5d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c6c60> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969c72c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c61b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11969c7d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969c7470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969aa390> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966cfcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966f8770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966f84d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966f87a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966f8980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966cde50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966fa000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966f8c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11969aab40> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119672a360> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196742510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119677b2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11967a1a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119677b410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11967431a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965c03e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196741550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11966faf00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f11965c05c0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_n635dq3f/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196626000> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965fcef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965c3f80> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11965ffe60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11966559a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196655730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196655040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196655790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196626a20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f11966566c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1196656840> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196656d50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964bcad0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11964be780> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964bf140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964bff80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c2d80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11964c2ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c1040> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c6c60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c5730> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c5490> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f11964c7ce0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11964c1550> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119650ade0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650af60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119650cb00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650c8c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119650f080> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650d1c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119651a8a0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119650f230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651b710> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651b8f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651b290> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f119650b1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651eb10> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651fec0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119651d2b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119651e180> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119651ce90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963a8140> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963a9670> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196522990> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963a9700> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963ab5f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963b1c70> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963b25d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963aa5d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11963b1220> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963b2810> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1196446a50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963bc7a0> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f11963b7b90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963b6660> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644d130> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594c0e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119594c440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963c1160> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11963c0740> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644eed0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644ec00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119594f3e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594ecc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f119594ee70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594e120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594f530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11959ba060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119594ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119644edb0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11959bbd70> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11959bae10> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11959ee390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11959dc6e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1195805ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1195805e50> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f11958328a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1195832f30> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119582ffb0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119587af90> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f11958796a0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119587b8c0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f119587a930> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2877, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 840, "free": 2877}, "nocache": {"free": 3464, "used": 253}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, 
"size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 922, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251211694080, "block_size": 4096, "block_total": 64483404, "block_available": 61330980, "block_used": 3152424, "inode_total": 16384000, "inode_available": 16303841, "inode_used": 80159, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.61767578125, "5m": 0.4580078125, "15m": 0.27294921875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "38", "epoch": "1726882778", "epoch_int": "1726882778", "date": "2024-09-20", "time": "21:39:38", "iso8601_micro": "2024-09-21T01:39:38.341553Z", "iso8601": "2024-09-21T01:39:38Z", "iso8601_basic": 
"20240920T213938341553", "iso8601_basic_short": "20240920T213938", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": 
["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing 
_collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] 
removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec 
# cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # 
destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] 
removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy 
encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # 
cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # 
destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
22286 1726882778.47857: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882778.47882: _low_level_execute_command(): starting 22286 1726882778.47886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882776.6760824-22308-113012573746923/ > /dev/null 2>&1 && sleep 0' 22286 1726882778.48323: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882778.48362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882778.48365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882778.48367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.48370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882778.48376: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.48425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882778.48428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882778.48556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882778.51376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882778.51413: stderr chunk (state=3): >>><<< 22286 1726882778.51417: stdout chunk (state=3): >>><<< 22286 1726882778.51430: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 22286 1726882778.51439: handler run complete 22286 1726882778.51555: variable 'ansible_facts' from source: unknown 22286 1726882778.51644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.51946: variable 'ansible_facts' from source: unknown 22286 1726882778.52018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.52141: attempt loop complete, returning result 22286 1726882778.52152: _execute() done 22286 1726882778.52157: dumping result to json 22286 1726882778.52180: done dumping result, returning 22286 1726882778.52187: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affe814-3a2d-a75d-4836-0000000000b9] 22286 1726882778.52192: sending task result for task 0affe814-3a2d-a75d-4836-0000000000b9 22286 1726882778.52856: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000b9 ok: [managed_node3] 22286 1726882778.52870: WORKER PROCESS EXITING 22286 1726882778.52981: no more pending results, returning what we have 22286 1726882778.52988: results queue empty 22286 1726882778.52989: checking for any_errors_fatal 22286 1726882778.52990: done checking for any_errors_fatal 22286 1726882778.52990: checking for max_fail_percentage 22286 1726882778.52991: done checking for max_fail_percentage 22286 1726882778.52992: checking to see if all hosts have failed and the running result is not ok 22286 1726882778.52993: done checking to see if all hosts have failed 22286 1726882778.52993: getting the remaining hosts for this loop 22286 1726882778.52995: done getting the remaining hosts for this loop 22286 1726882778.52997: getting the next task for host managed_node3 22286 1726882778.53002: done getting next task for host managed_node3 22286 1726882778.53004: ^ task is: TASK: meta (flush_handlers) 22286 1726882778.53005: ^ state is: HOST STATE: block=1, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882778.53008: getting variables 22286 1726882778.53009: in VariableManager get_vars() 22286 1726882778.53026: Calling all_inventory to load vars for managed_node3 22286 1726882778.53028: Calling groups_inventory to load vars for managed_node3 22286 1726882778.53030: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882778.53040: Calling all_plugins_play to load vars for managed_node3 22286 1726882778.53042: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882778.53044: Calling groups_plugins_play to load vars for managed_node3 22286 1726882778.53205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.53386: done with get_vars() 22286 1726882778.53395: done getting variables 22286 1726882778.53453: in VariableManager get_vars() 22286 1726882778.53461: Calling all_inventory to load vars for managed_node3 22286 1726882778.53463: Calling groups_inventory to load vars for managed_node3 22286 1726882778.53465: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882778.53468: Calling all_plugins_play to load vars for managed_node3 22286 1726882778.53470: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882778.53472: Calling groups_plugins_play to load vars for managed_node3 22286 1726882778.53596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.53769: done with get_vars() 22286 1726882778.53782: done queuing things up, now waiting for results queue to drain 22286 1726882778.53784: results queue empty 22286 1726882778.53784: checking for any_errors_fatal 22286 
1726882778.53786: done checking for any_errors_fatal 22286 1726882778.53787: checking for max_fail_percentage 22286 1726882778.53787: done checking for max_fail_percentage 22286 1726882778.53791: checking to see if all hosts have failed and the running result is not ok 22286 1726882778.53791: done checking to see if all hosts have failed 22286 1726882778.53792: getting the remaining hosts for this loop 22286 1726882778.53793: done getting the remaining hosts for this loop 22286 1726882778.53795: getting the next task for host managed_node3 22286 1726882778.53798: done getting next task for host managed_node3 22286 1726882778.53800: ^ task is: TASK: Include the task 'el_repo_setup.yml' 22286 1726882778.53801: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882778.53803: getting variables 22286 1726882778.53803: in VariableManager get_vars() 22286 1726882778.53809: Calling all_inventory to load vars for managed_node3 22286 1726882778.53811: Calling groups_inventory to load vars for managed_node3 22286 1726882778.53813: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882778.53816: Calling all_plugins_play to load vars for managed_node3 22286 1726882778.53818: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882778.53820: Calling groups_plugins_play to load vars for managed_node3 22286 1726882778.53944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.54131: done with get_vars() 22286 1726882778.54139: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11
Friday 20 September 2024  21:39:38 -0400 (0:00:01.919)       0:00:01.935 ******

22286 1726882778.54201: entering _queue_task() for managed_node3/include_tasks 22286 1726882778.54203: Creating lock for include_tasks 22286 1726882778.54426: worker is 1 (out of 1 available) 22286 1726882778.54440: exiting _queue_task() for managed_node3/include_tasks 22286 1726882778.54452: done queuing things up, now waiting for results queue to drain 22286 1726882778.54454: waiting for pending results... 
22286 1726882778.54604: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 22286 1726882778.54668: in run() - task 0affe814-3a2d-a75d-4836-000000000006 22286 1726882778.54686: variable 'ansible_search_path' from source: unknown 22286 1726882778.54716: calling self._execute() 22286 1726882778.54772: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882778.54780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882778.54791: variable 'omit' from source: magic vars 22286 1726882778.54879: _execute() done 22286 1726882778.54884: dumping result to json 22286 1726882778.54888: done dumping result, returning 22286 1726882778.54891: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affe814-3a2d-a75d-4836-000000000006] 22286 1726882778.54899: sending task result for task 0affe814-3a2d-a75d-4836-000000000006 22286 1726882778.54999: done sending task result for task 0affe814-3a2d-a75d-4836-000000000006 22286 1726882778.55004: WORKER PROCESS EXITING 22286 1726882778.55050: no more pending results, returning what we have 22286 1726882778.55054: in VariableManager get_vars() 22286 1726882778.55084: Calling all_inventory to load vars for managed_node3 22286 1726882778.55087: Calling groups_inventory to load vars for managed_node3 22286 1726882778.55090: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882778.55100: Calling all_plugins_play to load vars for managed_node3 22286 1726882778.55103: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882778.55106: Calling groups_plugins_play to load vars for managed_node3 22286 1726882778.55255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.55424: done with get_vars() 22286 1726882778.55430: variable 'ansible_search_path' from source: unknown 22286 1726882778.55442: we have 
included files to process 22286 1726882778.55443: generating all_blocks data 22286 1726882778.55444: done generating all_blocks data 22286 1726882778.55444: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22286 1726882778.55446: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22286 1726882778.55449: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22286 1726882778.56003: in VariableManager get_vars() 22286 1726882778.56015: done with get_vars() 22286 1726882778.56024: done processing included file 22286 1726882778.56026: iterating over new_blocks loaded from include file 22286 1726882778.56027: in VariableManager get_vars() 22286 1726882778.56035: done with get_vars() 22286 1726882778.56037: filtering new block on tags 22286 1726882778.56047: done filtering new block on tags 22286 1726882778.56049: in VariableManager get_vars() 22286 1726882778.56056: done with get_vars() 22286 1726882778.56057: filtering new block on tags 22286 1726882778.56068: done filtering new block on tags 22286 1726882778.56069: in VariableManager get_vars() 22286 1726882778.56078: done with get_vars() 22286 1726882778.56079: filtering new block on tags 22286 1726882778.56090: done filtering new block on tags 22286 1726882778.56091: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 22286 1726882778.56095: extending task lists for all hosts with included blocks 22286 1726882778.56137: done extending task lists 22286 1726882778.56138: done processing included files 22286 1726882778.56139: results queue empty 22286 1726882778.56139: checking for any_errors_fatal 22286 1726882778.56140: done checking for any_errors_fatal 22286 
1726882778.56141: checking for max_fail_percentage 22286 1726882778.56142: done checking for max_fail_percentage 22286 1726882778.56142: checking to see if all hosts have failed and the running result is not ok 22286 1726882778.56143: done checking to see if all hosts have failed 22286 1726882778.56143: getting the remaining hosts for this loop 22286 1726882778.56144: done getting the remaining hosts for this loop 22286 1726882778.56146: getting the next task for host managed_node3 22286 1726882778.56149: done getting next task for host managed_node3 22286 1726882778.56151: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 22286 1726882778.56152: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882778.56154: getting variables 22286 1726882778.56154: in VariableManager get_vars() 22286 1726882778.56160: Calling all_inventory to load vars for managed_node3 22286 1726882778.56162: Calling groups_inventory to load vars for managed_node3 22286 1726882778.56163: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882778.56167: Calling all_plugins_play to load vars for managed_node3 22286 1726882778.56169: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882778.56171: Calling groups_plugins_play to load vars for managed_node3 22286 1726882778.56307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882778.56483: done with get_vars() 22286 1726882778.56491: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024  21:39:38 -0400 (0:00:00.023)       0:00:01.958 ******

22286 1726882778.56543: entering _queue_task() for managed_node3/setup 22286 1726882778.56730: worker is 1 (out of 1 available) 22286 1726882778.56744: exiting _queue_task() for managed_node3/setup 22286 1726882778.56756: done queuing things up, now waiting for results queue to drain 22286 1726882778.56758: waiting for pending results... 
22286 1726882778.56910: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 22286 1726882778.56982: in run() - task 0affe814-3a2d-a75d-4836-0000000000ca 22286 1726882778.56992: variable 'ansible_search_path' from source: unknown 22286 1726882778.56997: variable 'ansible_search_path' from source: unknown 22286 1726882778.57027: calling self._execute() 22286 1726882778.57084: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882778.57090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882778.57102: variable 'omit' from source: magic vars 22286 1726882778.57519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882778.59796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882778.59849: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882778.59883: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882778.59924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882778.59948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882778.60017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882778.60042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882778.60063: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882778.60104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882778.60118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882778.60260: variable 'ansible_facts' from source: unknown 22286 1726882778.60324: variable 'network_test_required_facts' from source: task vars 22286 1726882778.60358: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 22286 1726882778.60364: variable 'omit' from source: magic vars 22286 1726882778.60395: variable 'omit' from source: magic vars 22286 1726882778.60425: variable 'omit' from source: magic vars 22286 1726882778.60448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882778.60472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882778.60490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882778.60505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882778.60516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882778.60552: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882778.60556: variable 'ansible_host' from source: host vars for 
'managed_node3' 22286 1726882778.60558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882778.60645: Set connection var ansible_shell_executable to /bin/sh 22286 1726882778.60655: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882778.60658: Set connection var ansible_connection to ssh 22286 1726882778.60661: Set connection var ansible_shell_type to sh 22286 1726882778.60741: Set connection var ansible_timeout to 10 22286 1726882778.60745: Set connection var ansible_pipelining to False 22286 1726882778.60748: variable 'ansible_shell_executable' from source: unknown 22286 1726882778.60751: variable 'ansible_connection' from source: unknown 22286 1726882778.60753: variable 'ansible_module_compression' from source: unknown 22286 1726882778.60756: variable 'ansible_shell_type' from source: unknown 22286 1726882778.60758: variable 'ansible_shell_executable' from source: unknown 22286 1726882778.60760: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882778.60763: variable 'ansible_pipelining' from source: unknown 22286 1726882778.60765: variable 'ansible_timeout' from source: unknown 22286 1726882778.60767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882778.60839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882778.60850: variable 'omit' from source: magic vars 22286 1726882778.60853: starting attempt loop 22286 1726882778.60856: running the handler 22286 1726882778.60868: _low_level_execute_command(): starting 22286 1726882778.60875: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882778.61913: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 
1726882778.62022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882778.62025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882778.62028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882778.62030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882778.62033: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882778.62037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.62040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882778.62042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882778.62196: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882778.62439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882778.62443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882778.62446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882778.64223: stdout chunk (state=3): >>>/root <<< 22286 1726882778.64273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882778.64503: stderr chunk (state=3): >>><<< 22286 1726882778.64515: stdout chunk (state=3): >>><<< 22286 
1726882778.64538: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882778.64560: _low_level_execute_command(): starting 22286 1726882778.64572: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739 `" && echo ansible-tmp-1726882778.6454618-22364-92194144137739="` echo /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739 `" ) && sleep 0' 22286 1726882778.65498: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882778.65502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882778.65505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882778.65507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882778.65510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.65617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882778.65770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882778.68623: stdout chunk (state=3): >>>ansible-tmp-1726882778.6454618-22364-92194144137739=/root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739 <<< 22286 1726882778.68815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882778.68820: stdout chunk (state=3): >>><<< 22286 1726882778.68826: stderr chunk (state=3): >>><<< 22286 1726882778.69044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882778.6454618-22364-92194144137739=/root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882778.69048: variable 'ansible_module_compression' from source: unknown 22286 1726882778.69051: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22286 1726882778.69110: variable 'ansible_facts' from source: unknown 22286 1726882778.69611: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py 22286 1726882778.69884: Sending initial data 22286 1726882778.69887: Sent initial data (153 bytes) 22286 1726882778.70757: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882778.70968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.71016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882778.71033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882778.71078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882778.71311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22286 1726882778.73200: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882778.73203: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882778.73532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882778.73651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpntaoio9r /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py <<< 22286 1726882778.73659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py" <<< 22286 1726882778.73772: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpntaoio9r" to remote "/root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py" <<< 22286 1726882778.79966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882778.80135: stderr chunk (state=3): >>><<< 22286 1726882778.80141: stdout chunk (state=3): >>><<< 22286 1726882778.80143: done transferring module to remote 22286 1726882778.80146: _low_level_execute_command(): starting 22286 1726882778.80148: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/ /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py && sleep 0' 22286 1726882778.81654: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.81735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882778.81864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882778.82020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882778.84121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882778.84134: stdout chunk (state=3): >>><<< 22286 1726882778.84230: stderr chunk (state=3): >>><<< 22286 1726882778.84235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882778.84239: _low_level_execute_command(): starting 22286 1726882778.84242: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/AnsiballZ_setup.py && sleep 0' 22286 1726882778.85497: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882778.85721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882778.85891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882778.88169: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 22286 1726882778.88239: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 22286 1726882778.88284: stdout chunk 
(state=3): >>>import 'posix' # <<< 22286 1726882778.88410: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882778.88449: stdout chunk (state=3): >>>import '_codecs' # <<< 22286 1726882778.88452: stdout chunk (state=3): >>>import 'codecs' # <<< 22286 1726882778.88531: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 22286 1726882778.88582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 22286 1726882778.88587: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0fb4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0f83b30> <<< 22286 1726882778.88590: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 22286 1726882778.88838: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0fb6ab0> <<< 22286 1726882778.88845: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 22286 1726882778.88848: stdout chunk (state=3): >>>import 'abc' # <<< 22286 1726882778.88895: stdout chunk (state=3): >>>import 'io' # <<< 22286 1726882778.88904: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user 
site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 22286 1726882778.88943: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 22286 1726882778.88946: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0d65160> <<< 22286 1726882778.89007: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 22286 1726882778.89028: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0d65fd0> <<< 22286 1726882778.89050: stdout chunk (state=3): >>>import 'site' # <<< 22286 1726882778.89202: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 22286 1726882778.89486: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 22286 1726882778.89522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22286 1726882778.89539: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882778.89555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22286 1726882778.89600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22286 1726882778.89613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22286 1726882778.89645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 22286 1726882778.89686: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da3e90> <<< 22286 1726882778.89712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da3f50> <<< 22286 1726882778.89744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22286 1726882778.89767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 22286 1726882778.89794: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22286 1726882778.89846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882778.89872: stdout chunk (state=3): >>>import 'itertools' # <<< 22286 1726882778.89948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ddb8c0> <<< 22286 1726882778.89963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ddbf50> import '_collections' # <<< 22286 1726882778.90001: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dbbb60> <<< 22286 1726882778.90015: stdout chunk (state=3): >>>import '_functools' # <<< 22286 1726882778.90063: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0db9280> <<< 22286 1726882778.90144: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da1040> <<< 22286 1726882778.90197: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 22286 1726882778.90243: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22286 1726882778.90297: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22286 1726882778.90310: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22286 1726882778.90323: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dff800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dfe420> <<< 22286 1726882778.90366: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 22286 1726882778.90369: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dfccb0> <<< 22286 1726882778.90422: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 22286 1726882778.90458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e30890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da02c0> <<< 22286 1726882778.90462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 22286 1726882778.90552: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed 
from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e30d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e30bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e30fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0d9ede0> <<< 22286 1726882778.90586: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882778.90672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 22286 1726882778.90679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e31670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e31340> import 'importlib.machinery' # <<< 22286 1726882778.90716: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 22286 1726882778.90769: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e32510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py <<< 22286 1726882778.90795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22286 1726882778.90860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e48740> <<< 22286 1726882778.90906: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e49e20> <<< 22286 1726882778.90970: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 22286 1726882778.90974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e4acf0> <<< 22286 1726882778.91031: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e4b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e4a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # 
code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22286 1726882778.91190: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882778.91208: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e4bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e4b500> <<< 22286 1726882778.91230: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e32570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22286 1726882778.91251: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882778.91278: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b3fc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 22286 1726882778.91317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import 
'_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b687a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b68500> <<< 22286 1726882778.91344: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b687d0> <<< 22286 1726882778.91384: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b689b0> <<< 22286 1726882778.91397: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b3de20> <<< 22286 1726882778.91423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22286 1726882778.91667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b6a0c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b68d40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e32c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22286 1726882778.91684: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882778.91708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22286 1726882778.91753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22286 1726882778.91777: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b96420> <<< 22286 1726882778.91824: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 22286 1726882778.91881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22286 1726882778.91939: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0bb2540> <<< 22286 1726882778.91986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22286 1726882778.92007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22286 1726882778.92054: stdout chunk (state=3): >>>import 'ntpath' # <<< 22286 1726882778.92088: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0beb2c0> <<< 22286 1726882778.92201: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22286 1726882778.92226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22286 1726882778.92298: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0c11a60> <<< 22286 1726882778.92374: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0beb3e0> <<< 22286 1726882778.92415: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0bb31d0> <<< 22286 1726882778.92444: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 22286 1726882778.92525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a30380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0bb1580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b6afc0> <<< 22286 1726882778.92651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22286 1726882778.92665: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f41f0a30560> <<< 22286 1726882778.92836: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_3ofno15o/ansible_setup_payload.zip' <<< 22286 1726882778.92939: stdout chunk (state=3): >>># zipimport: zlib available <<< 
22286 1726882778.93000: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.93070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 22286 1726882778.93084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22286 1726882778.93161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22286 1726882778.93196: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a99fd0> import '_typing' # <<< 22286 1726882778.93400: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a70ec0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a33f50> <<< 22286 1726882778.93430: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 22286 1726882778.93541: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882778.93545: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.93558: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 22286 1726882778.95079: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.96385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a73e60> <<< 22286 1726882778.96480: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22286 1726882778.96515: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0ac99a0> <<< 22286 1726882778.96552: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ac9730> <<< 22286 1726882778.96590: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ac9040> <<< 22286 1726882778.96650: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 22286 1726882778.96656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22286 1726882778.96676: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ac9a90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a9ac60> import 'atexit' # <<< 22286 1726882778.96748: stdout chunk (state=3): 
>>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0aca750> <<< 22286 1726882778.96751: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0aca990> <<< 22286 1726882778.96867: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22286 1726882778.96871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 22286 1726882778.96886: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0acaed0> import 'pwd' # <<< 22286 1726882778.97065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0930b90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f09327b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 22286 
1726882778.97082: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0933140> <<< 22286 1726882778.97107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22286 1726882778.97130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22286 1726882778.97153: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0934320> <<< 22286 1726882778.97172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22286 1726882778.97340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0936db0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0937110> <<< 22286 1726882778.97377: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0935070> <<< 22286 1726882778.97381: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22286 1726882778.97411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 22286 1726882778.97455: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 22286 1726882778.97459: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22286 1726882778.97490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22286 1726882778.97517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 22286 1726882778.97544: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f093ade0> <<< 22286 1726882778.97621: stdout chunk (state=3): >>>import '_tokenize' # <<< 22286 1726882778.97648: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09398b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0939610> <<< 22286 1726882778.97671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22286 1726882778.97772: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f093be30> <<< 22286 1726882778.97911: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0935580> <<< 22286 1726882778.97915: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 
'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f097ef00> <<< 22286 1726882778.97918: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f097f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22286 1726882778.97941: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882778.97966: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0980c20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09809e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22286 1726882778.98084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22286 1726882778.98203: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882778.98231: stdout chunk (state=3): >>># extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0983170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09812e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 22286 1726882778.98257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 22286 1726882778.98311: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f098e960> <<< 22286 1726882778.98471: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09832f0> <<< 22286 1726882778.98553: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f098fbf0> <<< 22286 1726882778.98706: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f098f770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f098fcb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f097f380> <<< 22286 1726882778.98756: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22286 1726882778.98764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22286 1726882778.98766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22286 1726882778.98802: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882778.98805: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f09932c0> <<< 22286 1726882778.99151: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882778.99154: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f09942f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0991a30> <<< 22286 1726882778.99179: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0992de0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0991640> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22286 1726882778.99217: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.99327: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.99355: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 22286 1726882778.99380: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22286 1726882778.99511: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.99555: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882778.99699: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.00369: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.01052: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 22286 1726882779.01146: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.01173: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # 
extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f081c4d0> <<< 22286 1726882779.01290: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22286 1726882779.01327: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081d340> <<< 22286 1726882779.01330: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09979b0> <<< 22286 1726882779.01392: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 22286 1726882779.01423: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.01546: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 22286 1726882779.01549: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.01630: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.01816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 22286 1726882779.01896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081d310> <<< 22286 1726882779.01909: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.02425: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.02996: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.03116: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.03164: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.collections' # <<< 22286 1726882779.03365: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.03374: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.03467: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22286 1726882779.03577: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 22286 1726882779.03673: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 22286 1726882779.03954: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.04201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22286 1726882779.04314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22286 1726882779.04432: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081fcb0> # zipimport: zlib available <<< 22286 1726882779.04590: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.04594: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22286 1726882779.04645: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 
22286 1726882779.04823: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0825d90> <<< 22286 1726882779.05115: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0826720> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081e990> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.05119: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.05199: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22286 1726882779.05265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.05354: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0825490> <<< 22286 1726882779.05372: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0826810> <<< 22286 1726882779.05441: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 22286 1726882779.05569: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.05588: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.05624: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.05655: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 22286 1726882779.05673: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22286 1726882779.05784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22286 1726882779.05797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22286 1726882779.05872: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08b68d0> <<< 22286 1726882779.05895: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0830560> <<< 22286 1726882779.06001: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f082a6c0> <<< 22286 1726882779.06027: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f41f082a540> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 22286 1726882779.06049: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06073: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22286 1726882779.06390: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.06420: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06484: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06543: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06597: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 22286 1726882779.06868: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06907: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.06945: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.07005: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 22286 1726882779.07008: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.07322: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.07627: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.07712: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.07890: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 22286 1726882779.07893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 22286 1726882779.07925: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 22286 1726882779.07968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bd3d0> <<< 22286 1726882779.08002: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22286 1726882779.08031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 22286 1726882779.08111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 22286 1726882779.08128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22286 1726882779.08153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd47f80> <<< 22286 1726882779.08198: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed 
from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efd482c0> <<< 22286 1726882779.08282: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08a1040> <<< 22286 1726882779.08346: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08a2390> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bf800> <<< 22286 1726882779.08372: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bf140> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 22286 1726882779.08537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 22286 1726882779.08570: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efd4b2c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd4ab70> <<< 22286 1726882779.08592: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' 
# extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efd4ad50> <<< 22286 1726882779.08608: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd49fa0> <<< 22286 1726882779.08645: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 22286 1726882779.08787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 22286 1726882779.08831: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd4b380> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22286 1726882779.08868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 22286 1726882779.08945: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efdb5eb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd4be90> <<< 22286 1726882779.08987: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bf380> import 'ansible.module_utils.facts.timeout' # <<< 22286 1726882779.09011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 22286 
1726882779.09053: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09068: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 22286 1726882779.09192: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 22286 1726882779.09239: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09301: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 22286 1726882779.09397: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.09400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 22286 1726882779.09411: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09452: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 22286 1726882779.09637: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 22286 1726882779.09665: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 22286 1726882779.09701: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.09764: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.10341: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 22286 1726882779.10724: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 22286 1726882779.11342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.11365: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 22286 1726882779.11384: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.11666: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.11669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22286 1726882779.11761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 22286 1726882779.11837: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.11928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 22286 1726882779.11947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 22286 1726882779.11968: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdb7860> <<< 22286 1726882779.11983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22286 1726882779.12156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f41efdb6b10> import 'ansible.module_utils.facts.system.local' # <<< 22286 1726882779.12182: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.12242: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.12310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 22286 1726882779.12323: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.12417: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.12601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.12678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 22286 1726882779.12724: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.12746: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.12785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22286 1726882779.13050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efdea1e0> <<< 22286 1726882779.13374: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdd5fd0> import 'ansible.module_utils.facts.system.python' # <<< 22286 1726882779.13398: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.13502: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.13564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' 
# # zipimport: zlib available <<< 22286 1726882779.13701: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.13940: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.14040: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.14304: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 22286 1726882779.14307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 22286 1726882779.14322: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.14363: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.14431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 22286 1726882779.14487: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.14604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882779.14630: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efc01760> <<< 22286 1726882779.14669: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdd7980> <<< 22286 1726882779.14673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 22286 1726882779.14693: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 22286 1726882779.14750: stdout chunk (state=3): >>># zipimport: zlib available <<< 
22286 1726882779.14771: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.14800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 22286 1726882779.14821: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15099: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15363: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22286 1726882779.15397: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15653: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15806: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15810: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15852: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 22286 1726882779.15880: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15927: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.15931: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.16260: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.16338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 22286 1726882779.16606: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # <<< 22286 1726882779.16609: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.16637: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.16677: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.17291: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.18165: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 22286 1726882779.18303: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.18477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 22286 1726882779.18494: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.18657: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.18816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 22286 1726882779.18881: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.19101: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.19471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 22286 1726882779.19505: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.19533: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available<<< 22286 1726882779.19670: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.19798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 22286 1726882779.19972: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.19975: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.20140: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.20343: stdout chunk (state=3): >>> <<< 22286 1726882779.20551: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.20573: stdout chunk (state=3): >>> <<< 22286 1726882779.20961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 22286 1726882779.20986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 22286 1726882779.21011: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 22286 1726882779.21079: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.21101: stdout chunk (state=3): >>> <<< 22286 1726882779.21152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 22286 1726882779.21186: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.21213: stdout chunk (state=3): >>> <<< 22286 1726882779.21229: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.21254: stdout chunk (state=3): >>> <<< 22286 1726882779.21310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 22286 1726882779.21314: stdout chunk (state=3): >>> <<< 22286 1726882779.21333: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.21468: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.21481: stdout chunk (state=3): >>> <<< 22286 1726882779.21606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 22286 1726882779.21642: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.21653: stdout chunk (state=3): >>> <<< 22286 1726882779.21712: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 22286 1726882779.21742: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.21766: stdout chunk (state=3): >>> <<< 22286 1726882779.21951: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 22286 1726882779.21980: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.22003: stdout chunk (state=3): >>> <<< 22286 1726882779.22094: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.22115: stdout chunk (state=3): >>> <<< 22286 1726882779.22197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 
22286 1726882779.22229: stdout chunk (state=3): >>> <<< 22286 1726882779.22254: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.22751: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.23271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 22286 1726882779.23483: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 22286 1726882779.23505: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.23575: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.23589: stdout chunk (state=3): >>> <<< 22286 1726882779.23657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 22286 1726882779.23683: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.23748: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.23751: stdout chunk (state=3): >>> <<< 22286 1726882779.23828: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available<<< 22286 1726882779.23831: stdout chunk (state=3): >>> <<< 22286 1726882779.23986: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 22286 1726882779.24010: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.24271: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.24277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 22286 1726882779.24300: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.24354: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available<<< 22286 1726882779.24357: stdout chunk (state=3): >>> <<< 22286 1726882779.24520: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # <<< 22286 1726882779.24526: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.24579: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.24626: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.24629: stdout chunk (state=3): >>> <<< 22286 1726882779.24804: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22286 1726882779.24963: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.25092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 22286 1726882779.25141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22286 1726882779.25146: stdout chunk (state=3): >>> <<< 22286 1726882779.25172: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.25259: stdout chunk (state=3): >>> <<< 22286 1726882779.25280: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.25377: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22286 1726882779.25386: stdout chunk (state=3): >>> <<< 22286 1726882779.25417: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.25780: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.25806: stdout chunk (state=3): >>> <<< 22286 1726882779.26179: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 22286 1726882779.26201: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.26280: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.26303: stdout chunk (state=3): >>> <<< 22286 1726882779.26365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22286 1726882779.26396: stdout chunk (state=3): 
>>> <<< 22286 1726882779.26423: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.26494: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.26510: stdout chunk (state=3): >>> <<< 22286 1726882779.26567: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 22286 1726882779.26600: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22286 1726882779.26762: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22286 1726882779.26913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 22286 1726882779.26938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 22286 1726882779.26979: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.26990: stdout chunk (state=3): >>> <<< 22286 1726882779.27143: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.27155: stdout chunk (state=3): >>> <<< 22286 1726882779.27333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 22286 1726882779.27362: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.compat' # <<< 22286 1726882779.27374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 22286 1726882779.27500: stdout chunk (state=3): >>># zipimport: zlib available<<< 22286 1726882779.27518: stdout chunk (state=3): >>> <<< 22286 1726882779.29101: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py<<< 22286 1726882779.29126: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 22286 1726882779.29200: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc'<<< 
22286 1726882779.29218: stdout chunk (state=3): >>> <<< 22286 1726882779.29243: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 22286 1726882779.29293: stdout chunk (state=3): >>> # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efc2f500> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efc2cd10> <<< 22286 1726882779.29450: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efc2cfb0> <<< 22286 1726882779.29880: stdout chunk (state=3): >>> <<< 22286 1726882779.29924: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_I<<< 22286 1726882779.29939: stdout chunk (state=3): >>>MAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN9<<< 22286 1726882779.30065: stdout chunk (state=3): >>>9SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "39", "epoch": "1726882779", "epoch_int": "1726882779", 
"date": "2024-09-20", "time": "21:39:39", "iso8601_micro": "2024-09-21T01:39:39.288794Z", "iso8601": "2024-09-21T01:39:39Z", "iso8601_basic": "20240920T213939288794", "iso8601_basic_short": "20240920T213939", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22286 1726882779.30812: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 22286 1726882779.30845: stdout chunk (state=3): >>> # clear sys.path_hooks # clear builtins._ <<< 22286 1726882779.30893: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1<<< 22286 1726882779.30905: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__<<< 22286 1726882779.30943: stdout chunk (state=3): >>> # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings<<< 22286 1726882779.30967: stdout chunk (state=3): >>> # cleanup[2] removing _weakref # 
cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc<<< 22286 1726882779.31021: stdout chunk (state=3): >>> # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types<<< 22286 1726882779.31028: stdout chunk (state=3): >>> # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 22286 1726882779.31066: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 22286 1726882779.31115: stdout chunk (state=3): >>> # 
cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma<<< 22286 1726882779.31128: stdout chunk (state=3): >>> # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading<<< 22286 1726882779.31172: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437<<< 22286 1726882779.31187: stdout chunk (state=3): >>> # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ <<< 22286 1726882779.31221: stdout chunk (state=3): >>># destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing 
locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select <<< 22286 1726882779.31247: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd<<< 22286 1726882779.31281: stdout chunk (state=3): >>> # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal<<< 22286 1726882779.31337: stdout chunk (state=3): >>> # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common<<< 22286 1726882779.31346: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes<<< 22286 
1726882779.31381: stdout chunk (state=3): >>> # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast<<< 22286 1726882779.31403: stdout chunk (state=3): >>> # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec<<< 22286 1726882779.31433: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux<<< 22286 1726882779.31473: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # 
cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic<<< 22286 1726882779.31490: stdout chunk (state=3): >>> # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle<<< 22286 1726882779.31516: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 22286 1726882779.31571: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system<<< 22286 1726882779.31580: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # 
cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob <<< 22286 1726882779.31614: stdout chunk (state=3): >>># cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys<<< 22286 1726882779.31632: stdout chunk (state=3): >>> # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos<<< 22286 1726882779.31669: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl<<< 22286 1726882779.31714: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # 
cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat<<< 22286 1726882779.31752: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns<<< 22286 1726882779.31805: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base<<< 
22286 1726882779.31811: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network<<< 22286 1726882779.31855: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme<<< 22286 1726882779.31858: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd<<< 22286 1726882779.32052: stdout 
chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 22286 1726882779.32426: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 22286 1726882779.32467: stdout chunk (state=3): >>> # destroy importlib.machinery<<< 22286 1726882779.32501: stdout chunk (state=3): >>> # destroy importlib._abc # destroy importlib.util <<< 22286 1726882779.32564: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression<<< 22286 1726882779.32569: stdout chunk (state=3): >>> <<< 22286 1726882779.32644: stdout chunk (state=3): >>># destroy _lzma # destroy binascii # destroy zlib # destroy bz2 <<< 22286 1726882779.32648: stdout chunk (state=3): >>># destroy lzma <<< 22286 1726882779.32707: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile <<< 22286 1726882779.32723: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 22286 1726882779.32777: stdout chunk (state=3): >>> # destroy ntpath<<< 22286 1726882779.32823: stdout chunk (state=3): >>> # destroy importlib <<< 22286 1726882779.32869: stdout chunk (state=3): >>># destroy zipimport # destroy __main__<<< 22286 1726882779.32888: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json<<< 22286 1726882779.32946: stdout chunk (state=3): >>> # destroy grp<<< 22286 1726882779.32951: stdout chunk (state=3): >>> # destroy encodings # destroy _locale # destroy locale<<< 22286 1726882779.32995: stdout chunk (state=3): >>> # destroy select # destroy _signal # destroy _posixsubprocess<<< 22286 1726882779.32998: stdout chunk (state=3): >>> # destroy syslog<<< 22286 1726882779.33084: stdout chunk (state=3): >>> # destroy uuid # destroy 
_hashlib<<< 22286 1726882779.33116: stdout chunk (state=3): >>> # destroy _blake2 <<< 22286 1726882779.33133: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 22286 1726882779.33155: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse<<< 22286 1726882779.33262: stdout chunk (state=3): >>> # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 22286 1726882779.33265: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector <<< 22286 1726882779.33318: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context<<< 22286 1726882779.33469: stdout chunk (state=3): >>> # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 22286 1726882779.33495: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 22286 1726882779.33579: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 22286 1726882779.33585: stdout chunk (state=3): >>># destroy errno # destroy json <<< 22286 1726882779.33631: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 22286 1726882779.33661: stdout chunk (state=3): >>># destroy glob <<< 22286 1726882779.33687: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout<<< 22286 1726882779.33760: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna<<< 22286 1726882779.33803: stdout chunk 
(state=3): >>> # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 22286 1726882779.33851: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 22286 1726882779.33855: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader<<< 22286 1726882779.33915: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 22286 1726882779.33918: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 22286 1726882779.33952: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2<<< 22286 1726882779.33984: stdout chunk (state=3): >>> # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 22286 1726882779.34017: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 22286 1726882779.34060: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 22286 1726882779.34124: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc 
# destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator<<< 22286 1726882779.34130: stdout chunk (state=3): >>> # cleanup[3] wiping _operator # cleanup[3] wiping types<<< 22286 1726882779.34196: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat<<< 22286 1726882779.34199: stdout chunk (state=3): >>> # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 22286 1726882779.34222: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 22286 1726882779.34264: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy selinux._selinux<<< 22286 1726882779.34285: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal<<< 22286 1726882779.34472: stdout chunk (state=3): >>> # destroy _datetime <<< 22286 1726882779.34572: stdout chunk (state=3): >>># destroy sys.monitoring <<< 22286 1726882779.34633: stdout chunk (state=3): >>># destroy _socket <<< 22286 1726882779.34642: stdout chunk (state=3): >>># destroy _collections <<< 22286 1726882779.34704: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 22286 1726882779.34747: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser <<< 22286 1726882779.34764: stdout chunk (state=3): >>># destroy tokenize <<< 22286 1726882779.34822: stdout chunk (state=3): >>># 
destroy ansible.module_utils.six.moves.urllib<<< 22286 1726882779.34825: stdout chunk (state=3): >>> <<< 22286 1726882779.34853: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 22286 1726882779.34887: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 22286 1726882779.34916: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 22286 1726882779.34960: stdout chunk (state=3): >>> # destroy _frozen_importlib_external <<< 22286 1726882779.35017: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path<<< 22286 1726882779.35023: stdout chunk (state=3): >>> <<< 22286 1726882779.35055: stdout chunk (state=3): >>># clear sys.modules # destroy _frozen_importlib <<< 22286 1726882779.35252: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 22286 1726882779.35474: stdout chunk (state=3): >>> # destroy _random # destroy _weakref<<< 22286 1726882779.35477: stdout chunk (state=3): >>> # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks<<< 22286 1726882779.35542: stdout chunk (state=3): >>> <<< 22286 1726882779.36148: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882779.36203: stderr chunk (state=3): >>><<< 22286 1726882779.36218: stdout chunk (state=3): >>><<< 22286 1726882779.36391: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0fb4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0f83b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0fb6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0d65160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0d65fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da3f50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ddb8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ddbf50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dbbb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0db9280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da1040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dff800> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dfe420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0dfccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e30890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0da02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e30d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e30bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e30fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0d9ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e31670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e31340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e32510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e48740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e49e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e4acf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e4b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e4a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0e4bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e4b500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e32570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b3fc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b687a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b68500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b687d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0b689b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b3de20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b6a0c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b68d40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0e32c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b96420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0bb2540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0beb2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0c11a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0beb3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0bb31d0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a30380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0bb1580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0b6afc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f41f0a30560> # zipimport: found 103 names in '/tmp/ansible_setup_payload_3ofno15o/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a99fd0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a70ec0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a33f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a73e60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0ac99a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ac9730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ac9040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0ac9a90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0a9ac60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0aca750> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0aca990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0acaed0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0930b90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f09327b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0933140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0934320> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0936db0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0937110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0935070> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f093ade0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09398b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0939610> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f093be30> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41f0935580> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f097ef00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f097f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0980c20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09809e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0983170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09812e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f098e960> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09832f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f098fbf0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f098f770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f098fcb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f097f380> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f09932c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f09942f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0991a30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0992de0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0991640> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f081c4d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081d340> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f09979b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081d310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081fcb0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0825d90> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0826720> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f081e990> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41f0825490> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0826810> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08b68d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f0830560> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f41f082a6c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f082a540> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bd3d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd47f80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efd482c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08a1040> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08a2390> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bf800> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bf140> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efd4b2c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd4ab70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efd4ad50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd49fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd4b380> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efdb5eb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efd4be90> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41f08bf380> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdb7860> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdb6b10> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efdea1e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdd5fd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efc01760> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efdd7980> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f41efc2f500> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efc2cd10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f41efc2cfb0> {"ansible_facts": 
{"ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 
0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "39", "epoch": "1726882779", "epoch_int": "1726882779", "date": "2024-09-20", "time": "21:39:39", "iso8601_micro": "2024-09-21T01:39:39.288794Z", "iso8601": "2024-09-21T01:39:39Z", "iso8601_basic": "20240920T213939288794", "iso8601_basic_short": "20240920T213939", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # 
cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # 
cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # 
cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy 
multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. [WARNING]: Module invocation had junk after the JSON data:
_posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping 
_typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 22286 1726882779.37800: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': 
None}) 22286 1726882779.37803: _low_level_execute_command(): starting 22286 1726882779.37805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882778.6454618-22364-92194144137739/ > /dev/null 2>&1 && sleep 0' 22286 1726882779.38108: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882779.38142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882779.38154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882779.38177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882779.38328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882779.41209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882779.41213: stdout chunk (state=3): >>><<< 22286 1726882779.41216: stderr chunk (state=3): >>><<< 22286 1726882779.41240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882779.41341: handler run complete 22286 1726882779.41345: variable 'ansible_facts' from source: unknown 22286 1726882779.41421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882779.41629: variable 'ansible_facts' from source: unknown 22286 1726882779.41726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882779.41827: attempt loop complete, returning result 22286 1726882779.41838: _execute() done 22286 1726882779.41846: dumping result to json 22286 1726882779.41865: done dumping result, returning 22286 1726882779.41903: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affe814-3a2d-a75d-4836-0000000000ca] 22286 1726882779.41906: sending task result for task 0affe814-3a2d-a75d-4836-0000000000ca 
22286 1726882779.42319: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000ca 22286 1726882779.42322: WORKER PROCESS EXITING ok: [managed_node3] 22286 1726882779.42495: no more pending results, returning what we have 22286 1726882779.42499: results queue empty 22286 1726882779.42500: checking for any_errors_fatal 22286 1726882779.42501: done checking for any_errors_fatal 22286 1726882779.42502: checking for max_fail_percentage 22286 1726882779.42504: done checking for max_fail_percentage 22286 1726882779.42505: checking to see if all hosts have failed and the running result is not ok 22286 1726882779.42506: done checking to see if all hosts have failed 22286 1726882779.42507: getting the remaining hosts for this loop 22286 1726882779.42509: done getting the remaining hosts for this loop 22286 1726882779.42514: getting the next task for host managed_node3 22286 1726882779.42524: done getting next task for host managed_node3 22286 1726882779.42527: ^ task is: TASK: Check if system is ostree 22286 1726882779.42531: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882779.42638: getting variables 22286 1726882779.42641: in VariableManager get_vars() 22286 1726882779.42680: Calling all_inventory to load vars for managed_node3 22286 1726882779.42684: Calling groups_inventory to load vars for managed_node3 22286 1726882779.42688: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882779.42699: Calling all_plugins_play to load vars for managed_node3 22286 1726882779.42703: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882779.42707: Calling groups_plugins_play to load vars for managed_node3 22286 1726882779.43064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882779.43425: done with get_vars() 22286 1726882779.43439: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:39:39 -0400 (0:00:00.870) 0:00:02.828 ****** 22286 1726882779.43554: entering _queue_task() for managed_node3/stat 22286 1726882779.43961: worker is 1 (out of 1 available) 22286 1726882779.43973: exiting _queue_task() for managed_node3/stat 22286 1726882779.43985: done queuing things up, now waiting for results queue to drain 22286 1726882779.43986: waiting for pending results... 
22286 1726882779.44225: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 22286 1726882779.44322: in run() - task 0affe814-3a2d-a75d-4836-0000000000cc 22286 1726882779.44326: variable 'ansible_search_path' from source: unknown 22286 1726882779.44329: variable 'ansible_search_path' from source: unknown 22286 1726882779.44369: calling self._execute() 22286 1726882779.44468: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882779.44496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882779.44539: variable 'omit' from source: magic vars 22286 1726882779.45119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882779.45486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882779.45567: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882779.45723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882779.45744: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882779.45886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882779.45935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882779.45982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882779.46029: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882779.46196: Evaluated conditional (not __network_is_ostree is defined): True 22286 1726882779.46209: variable 'omit' from source: magic vars 22286 1726882779.46289: variable 'omit' from source: magic vars 22286 1726882779.46352: variable 'omit' from source: magic vars 22286 1726882779.46393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882779.46461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882779.46469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882779.46546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882779.46549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882779.46572: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882779.46681: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882779.46686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882779.46753: Set connection var ansible_shell_executable to /bin/sh 22286 1726882779.46770: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882779.46787: Set connection var ansible_connection to ssh 22286 1726882779.46800: Set connection var ansible_shell_type to sh 22286 1726882779.46819: Set connection var ansible_timeout to 10 22286 1726882779.46837: Set connection var ansible_pipelining to False 22286 1726882779.46869: variable 'ansible_shell_executable' from source: unknown 22286 1726882779.46898: variable 'ansible_connection' from 
source: unknown 22286 1726882779.46902: variable 'ansible_module_compression' from source: unknown 22286 1726882779.46909: variable 'ansible_shell_type' from source: unknown 22286 1726882779.46912: variable 'ansible_shell_executable' from source: unknown 22286 1726882779.47005: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882779.47008: variable 'ansible_pipelining' from source: unknown 22286 1726882779.47011: variable 'ansible_timeout' from source: unknown 22286 1726882779.47015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882779.47150: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882779.47167: variable 'omit' from source: magic vars 22286 1726882779.47246: starting attempt loop 22286 1726882779.47250: running the handler 22286 1726882779.47253: _low_level_execute_command(): starting 22286 1726882779.47255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882779.48059: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882779.48139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882779.48206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882779.48237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882779.48264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882779.48423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882779.50936: stdout chunk (state=3): >>>/root <<< 22286 1726882779.51190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882779.51194: stdout chunk (state=3): >>><<< 22286 1726882779.51196: stderr chunk (state=3): >>><<< 22286 1726882779.51219: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882779.51336: _low_level_execute_command(): starting 22286 1726882779.51340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277 `" && echo ansible-tmp-1726882779.512332-22393-207467933464277="` echo /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277 `" ) && sleep 0' 22286 1726882779.51910: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882779.51919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882779.51936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882779.51977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882779.52092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882779.55065: stdout chunk 
(state=3): >>>ansible-tmp-1726882779.512332-22393-207467933464277=/root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277 <<< 22286 1726882779.55324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882779.55328: stdout chunk (state=3): >>><<< 22286 1726882779.55330: stderr chunk (state=3): >>><<< 22286 1726882779.55541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882779.512332-22393-207467933464277=/root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882779.55544: variable 'ansible_module_compression' from source: unknown 22286 1726882779.55547: ANSIBALLZ: Using lock for stat 22286 1726882779.55549: ANSIBALLZ: Acquiring lock 22286 1726882779.55551: ANSIBALLZ: Lock acquired: 140212085117952 22286 1726882779.55554: ANSIBALLZ: Creating module 
22286 1726882779.73444: ANSIBALLZ: Writing module into payload 22286 1726882779.73580: ANSIBALLZ: Writing module 22286 1726882779.73607: ANSIBALLZ: Renaming module 22286 1726882779.73620: ANSIBALLZ: Done creating module 22286 1726882779.73649: variable 'ansible_facts' from source: unknown 22286 1726882779.73737: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py 22286 1726882779.74024: Sending initial data 22286 1726882779.74027: Sent initial data (152 bytes) 22286 1726882779.74619: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882779.74700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882779.74760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882779.74776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882779.74812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882779.74979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882779.77574: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882779.77706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882779.77840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpb8vi9_n0 /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py <<< 22286 1726882779.77843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py" <<< 22286 1726882779.77957: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpb8vi9_n0" to remote "/root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py" <<< 22286 1726882779.79588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882779.79592: stdout chunk (state=3): >>><<< 22286 1726882779.79594: stderr chunk (state=3): >>><<< 22286 1726882779.79597: done transferring module to remote 22286 1726882779.79599: _low_level_execute_command(): 
starting 22286 1726882779.79601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/ /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py && sleep 0' 22286 1726882779.80174: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882779.80189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882779.80251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882779.80328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882779.80345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882779.80367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882779.80525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882779.83760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882779.83764: stdout chunk (state=3): >>><<< 22286 1726882779.83766: stderr chunk (state=3): >>><<< 22286 1726882779.83769: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882779.83771: _low_level_execute_command(): starting 22286 1726882779.83774: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/AnsiballZ_stat.py && sleep 0' 22286 1726882779.85116: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882779.85120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882779.85150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882779.85161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882779.85352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882779.88415: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22286 1726882779.88486: stdout chunk (state=3): >>>import _imp # builtin <<< 22286 1726882779.88527: stdout chunk (state=3): >>>import '_thread' # <<< 22286 1726882779.88554: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 22286 1726882779.88655: stdout chunk (state=3): >>>import '_io' # <<< 22286 1726882779.88677: stdout chunk (state=3): >>>import 'marshal' # <<< 22286 1726882779.88732: stdout chunk (state=3): >>>import 'posix' # <<< 22286 1726882779.88780: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 22286 1726882779.88821: stdout chunk (state=3): >>>import 'time' # <<< 22286 1726882779.88855: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 22286 1726882779.88922: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 22286 1726882779.88961: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.88979: stdout chunk (state=3): >>>import '_codecs' # <<< 22286 1726882779.89216: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8892c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a888fbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8892eab0> import '_signal' # import '_abc' # <<< 22286 1726882779.89227: stdout chunk (state=3): >>>import 'abc' # <<< 22286 1726882779.89350: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 22286 1726882779.89449: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22286 1726882779.89504: stdout chunk (state=3): >>>import 'genericpath' # <<< 22286 1726882779.89508: stdout chunk (state=3): >>>import 'posixpath' # <<< 22286 1726882779.89570: stdout chunk (state=3): >>>import 'os' # <<< 22286 1726882779.89574: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 22286 1726882779.89603: stdout chunk (state=3): >>>Processing user site-packages <<< 22286 1726882779.89652: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 22286 1726882779.89678: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 22286 1726882779.89727: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 22286 1726882779.89731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 22286 1726882779.89845: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a886dd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.89892: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a886ddfd0> <<< 22286 1726882779.89903: stdout chunk (state=3): >>>import 'site' # <<< 22286 1726882779.89953: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 22286 1726882779.90357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 22286 1726882779.90516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22286 1726882779.90527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22286 1726882779.90658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8871be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 22286 1726882779.90727: stdout chunk (state=3): >>>import '_operator' # <<< 22286 1726882779.90791: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8871bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22286 1726882779.90813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 22286 1726882779.90828: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22286 1726882779.90928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' 
<<< 22286 1726882779.90931: stdout chunk (state=3): >>>import 'itertools' # <<< 22286 1726882779.90987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 22286 1726882779.90990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 22286 1726882779.91156: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887538c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88753f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88733b60> import '_functools' # <<< 22286 1726882779.91178: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88731280> <<< 22286 1726882779.91323: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88719040> <<< 22286 1726882779.91369: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22286 1726882779.91415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22286 1726882779.91426: stdout chunk (state=3): >>>import '_sre' # <<< 22286 1726882779.91453: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22286 1726882779.91496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22286 1726882779.91548: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 22286 1726882779.91551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22286 1726882779.91586: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88777800> <<< 22286 1726882779.91614: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88776420> <<< 22286 1726882779.91663: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 22286 1726882779.91683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88732150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88774cb0> <<< 22286 1726882779.91786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 22286 1726882779.91807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a8890> <<< 22286 1726882779.91820: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887182c0> <<< 22286 1726882779.91854: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 22286 1726882779.92080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' 
executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887a8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887a8fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88716de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 22286 1726882779.92090: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a9670> <<< 22286 1726882779.92116: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a9340> import 'importlib.machinery' # <<< 22286 1726882779.92145: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 22286 1726882779.92175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 22286 1726882779.92202: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887aa510> <<< 22286 1726882779.92229: stdout chunk (state=3): >>>import 'importlib.util' # <<< 22286 1726882779.92244: stdout chunk 
(state=3): >>>import 'runpy' # <<< 22286 1726882779.92266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22286 1726882779.92308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22286 1726882779.92391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c0740> <<< 22286 1726882779.92396: stdout chunk (state=3): >>>import 'errno' # <<< 22286 1726882779.92574: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887c1e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 22286 1726882779.92635: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c2cf0> <<< 22286 1726882779.92686: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882779.92878: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f8a887c3350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c2270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887c3dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c3500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887aa570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 22286 1726882779.92881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 22286 1726882779.92911: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 22286 1726882779.92941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22286 1726882779.92988: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882779.93085: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8854bc80> <<< 22286 1726882779.93113: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88574740> <<< 22286 1726882779.93137: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885744a0> <<< 22286 1726882779.93241: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88574770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88574950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88549e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22286 1726882779.93405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 22286 1726882779.93422: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22286 1726882779.93535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 22286 1726882779.93539: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88575f70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88574bf0> 
import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887aac60> <<< 22286 1726882779.93570: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22286 1726882779.93654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.93679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22286 1726882779.93853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885a6300> <<< 22286 1726882779.93871: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 22286 1726882779.93892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882779.93949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22286 1726882779.93966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22286 1726882779.94041: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885be480> <<< 22286 1726882779.94173: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22286 1726882779.94176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22286 1726882779.94236: stdout chunk (state=3): >>>import 'ntpath' # <<< 22286 1726882779.94271: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 22286 1726882779.94300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885fb230> <<< 22286 1726882779.94468: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22286 1726882779.94471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22286 1726882779.94612: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8861d9d0> <<< 22286 1726882779.94725: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885fb350> <<< 22286 1726882779.94789: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885bf110> <<< 22286 1726882779.94831: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 22286 1726882779.94892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88440350> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885bd4c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88576ea0> <<< 22286 1726882779.95045: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22286 1726882779.95072: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8a88440530> <<< 22286 1726882779.95250: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_v2l0suoq/ansible_stat_payload.zip' <<< 22286 1726882779.95446: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.95541: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882779.95555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22286 1726882779.95600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22286 1726882779.95724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22286 1726882779.95893: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 22286 1726882779.95897: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88495fd0> import '_typing' # <<< 22286 1726882779.96126: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8846cec0> <<< 22286 1726882779.96348: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88443f80> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 22286 1726882779.98751: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 22286 1726882780.00826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 22286 1726882780.00894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8846fe60> <<< 22286 1726882780.00919: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882780.01109: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 22286 1726882780.01149: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a884bd9a0> <<< 22286 1726882780.01167: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bd730> <<< 22286 1726882780.01363: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bd040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 
'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bda60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88496c60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.01385: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a884be720> <<< 22286 1726882780.01436: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.01476: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a884be8a0> <<< 22286 1726882780.01668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22286 1726882780.01689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bedb0> <<< 22286 1726882780.01738: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22286 1726882780.01779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 22286 1726882780.01838: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88324c20> <<< 22286 1726882780.02159: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88326840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88327110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88327f50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22286 1726882780.02192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 22286 1726882780.02215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22286 1726882780.02380: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832ad50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.02414: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8832ae40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88329010> <<< 22286 1726882780.02483: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22286 1726882780.02540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 22286 1726882780.02561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22286 1726882780.02652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22286 1726882780.02742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 22286 1726882780.02745: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832ecc0> import '_tokenize' # <<< 22286 1726882780.02805: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832d790> <<< 22286 1726882780.02948: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832d520> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22286 1726882780.02998: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832fd10> <<< 22286 1726882780.03094: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88329490> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.03129: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88376e10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 22286 1726882780.03180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88376f30> <<< 22286 1726882780.03288: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 22286 1726882780.03321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88378b00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a883788c0> <<< 22286 1726882780.03446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22286 1726882780.03528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22286 
1726882780.03590: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.03643: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8837b020> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88379190> <<< 22286 1726882780.03760: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22286 1726882780.03787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 22286 1726882780.03803: stdout chunk (state=3): >>>import '_string' # <<< 22286 1726882780.03887: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a883827e0> <<< 22286 1726882780.04144: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8837b170> <<< 22286 1726882780.04456: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a883835f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88383a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88383b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a883771d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22286 1726882780.04471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22286 1726882780.04529: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.04739: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.04743: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88386b40> <<< 22286 1726882780.04908: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.04949: stdout chunk (state=3): >>>import 'array' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88387fe0> <<< 22286 1726882780.04980: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88385310> <<< 22286 1726882780.05044: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.05065: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88386660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88384e90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 22286 1726882780.05201: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.05270: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.05425: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.05454: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.05472: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 22286 1726882780.05498: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.05648: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 22286 1726882780.05779: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.06011: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.07165: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.08331: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 22286 1726882780.08463: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # 
import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882780.08501: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 22286 1726882780.08522: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8840c230> <<< 22286 1726882780.08686: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 22286 1726882780.08719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22286 1726882780.08757: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840d190> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8838b4a0> <<< 22286 1726882780.08826: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 22286 1726882780.09155: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22286 1726882780.09199: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.09512: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 22286 1726882780.09530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 22286 1726882780.09557: stdout chunk 
(state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840d7f0> <<< 22286 1726882780.09589: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.10647: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.11509: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.11638: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.11770: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 22286 1726882780.11800: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.11859: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.11962: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 22286 1726882780.12172: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.12277: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22286 1726882780.12317: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.12384: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 22286 1726882780.12427: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.12476: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 22286 1726882780.12509: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.12965: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.13475: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22286 1726882780.13565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22286 1726882780.13645: stdout chunk (state=3): >>>import '_ast' # <<< 22286 1726882780.13721: stdout chunk 
(state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840fa40> <<< 22286 1726882780.13754: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.13879: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.14037: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 22286 1726882780.14056: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 22286 1726882780.14071: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 22286 1726882780.14260: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.14372: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.14401: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8821ddf0> <<< 22286 1726882780.14455: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.14487: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8821e7b0> <<< 22286 1726882780.14500: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840e7b0> <<< 22286 1726882780.14527: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 
1726882780.14598: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.14676: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22286 1726882780.14686: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.14770: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.14828: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.14935: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.15053: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22286 1726882780.15115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882780.15262: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 22286 1726882780.15266: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8821d3a0> <<< 22286 1726882780.15333: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8821db50> <<< 22286 1726882780.15443: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 22286 1726882780.15511: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.15608: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.15667: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.15758: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 22286 1726882780.15762: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 22286 1726882780.15796: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 22286 1726882780.15819: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22286 1726882780.16057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22286 1726882780.16069: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a882ae930> <<< 22286 1726882780.16123: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a882285c0> <<< 22286 1726882780.16301: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a882226f0> <<< 22286 1726882780.16305: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88222540> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 22286 1726882780.16307: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.16357: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.16392: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 
'ansible.module_utils.common.sys_info' # <<< 22286 1726882780.16662: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 22286 1726882780.16835: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.17157: stdout chunk (state=3): >>># zipimport: zlib available <<< 22286 1726882780.17337: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 22286 1726882780.17350: stdout chunk (state=3): >>># destroy __main__ <<< 22286 1726882780.17800: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 22286 1726882780.17856: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv<<< 22286 1726882780.17859: stdout chunk (state=3): >>> # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 22286 1726882780.17862: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback<<< 22286 1726882780.17865: stdout chunk (state=3): >>> # clear sys.__interactivehook__ # clear sys.meta_path<<< 22286 1726882780.18072: stdout chunk (state=3): >>> # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] 
removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing 
_random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] remo<<< 22286 1726882780.18098: stdout chunk (state=3): >>>ving selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing 
string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # 
cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 22286 1726882780.18395: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 22286 1726882780.18420: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 22286 1726882780.18473: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 22286 1726882780.18497: stdout chunk (state=3): >>># 
destroy ntpath # destroy importlib # destroy zipimport <<< 22286 1726882780.18538: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 22286 1726882780.18547: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select <<< 22286 1726882780.18585: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array <<< 22286 1726882780.18614: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 22286 1726882780.18668: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux <<< 22286 1726882780.18748: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 22286 1726882780.18752: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 22286 
1726882780.18754: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 22286 1726882780.18855: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 22286 1726882780.18882: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22286 1726882780.18980: stdout chunk (state=3): >>># destroy sys.monitoring <<< 22286 1726882780.19028: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # 
destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 22286 1726882780.19082: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 22286 1726882780.19086: stdout chunk (state=3): >>># destroy _typing <<< 22286 1726882780.19121: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 22286 1726882780.19140: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22286 1726882780.19238: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 22286 1726882780.19267: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 22286 1726882780.19340: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools <<< 22286 1726882780.19343: stdout chunk (state=3): >>># destroy _abc <<< 22286 1726882780.19374: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 22286 1726882780.19947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882780.19971: stderr chunk (state=3): >>><<< 22286 1726882780.20064: stdout chunk (state=3): >>><<< 22286 1726882780.20294: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8892c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a888fbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8892eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a886dd160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a886ddfd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8871be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8871bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887538c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88753f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88733b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88731280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88719040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88777800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88776420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88732150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88774cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887182c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887a8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887a8fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88716de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887a9340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887aa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c0740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887c1e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c2cf0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887c3350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c2270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a887c3dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887c3500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887aa570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8854bc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88574740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885744a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88574770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88574950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88549e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88575f70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88574bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a887aac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885a6300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885be480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885fb230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8861d9d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885fb350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885bf110> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88440350> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a885bd4c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88576ea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8a88440530> # zipimport: found 30 names in '/tmp/ansible_stat_payload_v2l0suoq/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88495fd0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8846cec0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88443f80> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8846fe60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a884bd9a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bd730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bd040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bda60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88496c60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a884be720> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a884be8a0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a884bedb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88324c20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88326840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88327110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88327f50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832ad50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8832ae40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88329010> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832ecc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832d790> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832d520> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8832fd10> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8a88329490> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88376e10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88376f30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88378b00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a883788c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8837b020> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88379190> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a883827e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8837b170> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a883835f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88383a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88383b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a883771d0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88386b40> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88387fe0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88385310> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a88386660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88384e90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8840c230> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840d190> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8838b4a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840d7f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840fa40> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8821ddf0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8821e7b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8840e7b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8a8821d3a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a8821db50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a882ae930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a882285c0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8a882226f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8a88222540> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes 
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 22286 1726882780.21940: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882780.22056: _low_level_execute_command(): starting 22286 1726882780.22059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882779.512332-22393-207467933464277/ > /dev/null 2>&1 && sleep 0' 22286 1726882780.22120: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882780.22140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882780.22179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.22199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882780.22218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882780.22265: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.22380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.22383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882780.22463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882780.22560: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 22286 1726882780.22706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882780.25607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882780.25611: stdout chunk (state=3): >>><<< 22286 1726882780.25613: stderr chunk (state=3): >>><<< 22286 1726882780.25629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882780.25840: handler run complete 22286 1726882780.25843: attempt loop complete, returning result 22286 1726882780.25846: _execute() done 22286 1726882780.25848: dumping result to json 22286 1726882780.25850: done dumping result, returning 22286 1726882780.25856: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0affe814-3a2d-a75d-4836-0000000000cc] 22286 
1726882780.25858: sending task result for task 0affe814-3a2d-a75d-4836-0000000000cc 22286 1726882780.25930: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000cc 22286 1726882780.25935: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 22286 1726882780.26016: no more pending results, returning what we have 22286 1726882780.26019: results queue empty 22286 1726882780.26020: checking for any_errors_fatal 22286 1726882780.26028: done checking for any_errors_fatal 22286 1726882780.26029: checking for max_fail_percentage 22286 1726882780.26030: done checking for max_fail_percentage 22286 1726882780.26031: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.26032: done checking to see if all hosts have failed 22286 1726882780.26033: getting the remaining hosts for this loop 22286 1726882780.26044: done getting the remaining hosts for this loop 22286 1726882780.26049: getting the next task for host managed_node3 22286 1726882780.26055: done getting next task for host managed_node3 22286 1726882780.26058: ^ task is: TASK: Set flag to indicate system is ostree 22286 1726882780.26061: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.26064: getting variables 22286 1726882780.26066: in VariableManager get_vars() 22286 1726882780.26097: Calling all_inventory to load vars for managed_node3 22286 1726882780.26100: Calling groups_inventory to load vars for managed_node3 22286 1726882780.26103: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.26197: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.26201: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.26205: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.26404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.26779: done with get_vars() 22286 1726882780.26793: done getting variables 22286 1726882780.26986: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:39:40 -0400 (0:00:00.834) 0:00:03.663 ****** 22286 1726882780.27189: entering _queue_task() for managed_node3/set_fact 22286 1726882780.27193: Creating lock for set_fact 22286 1726882780.27737: worker is 1 (out of 1 available) 22286 1726882780.27975: exiting _queue_task() for managed_node3/set_fact 22286 1726882780.27988: done queuing things up, now waiting for results queue to drain 22286 1726882780.27990: waiting for pending results... 
22286 1726882780.28109: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 22286 1726882780.28181: in run() - task 0affe814-3a2d-a75d-4836-0000000000cd 22286 1726882780.28193: variable 'ansible_search_path' from source: unknown 22286 1726882780.28209: variable 'ansible_search_path' from source: unknown 22286 1726882780.28231: calling self._execute() 22286 1726882780.28296: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.28302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.28318: variable 'omit' from source: magic vars 22286 1726882780.28768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882780.28960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882780.29002: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882780.29030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882780.29060: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882780.29135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882780.29156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882780.29180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882780.29206: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882780.29307: Evaluated conditional (not __network_is_ostree is defined): True 22286 1726882780.29311: variable 'omit' from source: magic vars 22286 1726882780.29346: variable 'omit' from source: magic vars 22286 1726882780.29445: variable '__ostree_booted_stat' from source: set_fact 22286 1726882780.29485: variable 'omit' from source: magic vars 22286 1726882780.29507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882780.29536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882780.29553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882780.29569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882780.29580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882780.29605: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882780.29608: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.29613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.29700: Set connection var ansible_shell_executable to /bin/sh 22286 1726882780.29708: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882780.29711: Set connection var ansible_connection to ssh 22286 1726882780.29716: Set connection var ansible_shell_type to sh 22286 1726882780.29726: Set connection var ansible_timeout to 10 22286 1726882780.29736: Set connection var ansible_pipelining to False 22286 1726882780.29759: variable 'ansible_shell_executable' 
from source: unknown 22286 1726882780.29762: variable 'ansible_connection' from source: unknown 22286 1726882780.29765: variable 'ansible_module_compression' from source: unknown 22286 1726882780.29767: variable 'ansible_shell_type' from source: unknown 22286 1726882780.29772: variable 'ansible_shell_executable' from source: unknown 22286 1726882780.29777: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.29780: variable 'ansible_pipelining' from source: unknown 22286 1726882780.29785: variable 'ansible_timeout' from source: unknown 22286 1726882780.29790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.30008: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882780.30012: variable 'omit' from source: magic vars 22286 1726882780.30015: starting attempt loop 22286 1726882780.30017: running the handler 22286 1726882780.30020: handler run complete 22286 1726882780.30022: attempt loop complete, returning result 22286 1726882780.30024: _execute() done 22286 1726882780.30026: dumping result to json 22286 1726882780.30028: done dumping result, returning 22286 1726882780.30030: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affe814-3a2d-a75d-4836-0000000000cd] 22286 1726882780.30032: sending task result for task 0affe814-3a2d-a75d-4836-0000000000cd 22286 1726882780.30099: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000cd 22286 1726882780.30102: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 22286 1726882780.30215: no more pending results, returning what we have 22286 1726882780.30219: results 
queue empty 22286 1726882780.30220: checking for any_errors_fatal 22286 1726882780.30226: done checking for any_errors_fatal 22286 1726882780.30227: checking for max_fail_percentage 22286 1726882780.30230: done checking for max_fail_percentage 22286 1726882780.30231: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.30232: done checking to see if all hosts have failed 22286 1726882780.30233: getting the remaining hosts for this loop 22286 1726882780.30237: done getting the remaining hosts for this loop 22286 1726882780.30241: getting the next task for host managed_node3 22286 1726882780.30252: done getting next task for host managed_node3 22286 1726882780.30257: ^ task is: TASK: Fix CentOS6 Base repo 22286 1726882780.30261: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.30265: getting variables 22286 1726882780.30268: in VariableManager get_vars() 22286 1726882780.30302: Calling all_inventory to load vars for managed_node3 22286 1726882780.30306: Calling groups_inventory to load vars for managed_node3 22286 1726882780.30310: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.30322: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.30326: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.30482: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.30818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.31192: done with get_vars() 22286 1726882780.31203: done getting variables 22286 1726882780.31370: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:39:40 -0400 (0:00:00.043) 0:00:03.707 ****** 22286 1726882780.31413: entering _queue_task() for managed_node3/copy 22286 1726882780.31722: worker is 1 (out of 1 available) 22286 1726882780.31852: exiting _queue_task() for managed_node3/copy 22286 1726882780.31864: done queuing things up, now waiting for results queue to drain 22286 1726882780.31866: waiting for pending results... 
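The "Fix CentOS6 Base repo" copy task queued above is guarded by a distribution check; the skip result that follows records `"false_condition": "ansible_distribution == 'CentOS'"`. A minimal sketch of that guard (the copy payload and destination are not visible in this log, so both are placeholder assumptions):

```yaml
# Hypothetical shape of the guarded task; the repo-file content and
# destination actually used on CentOS 6 hosts are not shown in this trace.
- name: Fix CentOS6 Base repo
  copy:
    dest: "..."     # placeholder: real destination not in the log
    content: "..."  # placeholder: real repo definition not in the log
  when: ansible_distribution == 'CentOS'
```

Because this run targets a Fedora node, the conditional evaluates False and the task is skipped without any module execution on the remote host.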
22286 1726882780.32274: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 22286 1726882780.32357: in run() - task 0affe814-3a2d-a75d-4836-0000000000cf 22286 1726882780.32392: variable 'ansible_search_path' from source: unknown 22286 1726882780.32396: variable 'ansible_search_path' from source: unknown 22286 1726882780.32438: calling self._execute() 22286 1726882780.32505: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.32515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.32525: variable 'omit' from source: magic vars 22286 1726882780.32924: variable 'ansible_distribution' from source: facts 22286 1726882780.32948: Evaluated conditional (ansible_distribution == 'CentOS'): False 22286 1726882780.32951: when evaluation is False, skipping this task 22286 1726882780.32954: _execute() done 22286 1726882780.32959: dumping result to json 22286 1726882780.32964: done dumping result, returning 22286 1726882780.32971: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affe814-3a2d-a75d-4836-0000000000cf] 22286 1726882780.32980: sending task result for task 0affe814-3a2d-a75d-4836-0000000000cf 22286 1726882780.33077: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000cf 22286 1726882780.33081: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 22286 1726882780.33149: no more pending results, returning what we have 22286 1726882780.33152: results queue empty 22286 1726882780.33153: checking for any_errors_fatal 22286 1726882780.33157: done checking for any_errors_fatal 22286 1726882780.33158: checking for max_fail_percentage 22286 1726882780.33159: done checking for max_fail_percentage 22286 1726882780.33160: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.33161: done 
checking to see if all hosts have failed 22286 1726882780.33162: getting the remaining hosts for this loop 22286 1726882780.33163: done getting the remaining hosts for this loop 22286 1726882780.33167: getting the next task for host managed_node3 22286 1726882780.33172: done getting next task for host managed_node3 22286 1726882780.33177: ^ task is: TASK: Include the task 'enable_epel.yml' 22286 1726882780.33180: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882780.33184: getting variables 22286 1726882780.33185: in VariableManager get_vars() 22286 1726882780.33209: Calling all_inventory to load vars for managed_node3 22286 1726882780.33211: Calling groups_inventory to load vars for managed_node3 22286 1726882780.33214: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.33221: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.33223: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.33225: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.33374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.33574: done with get_vars() 22286 1726882780.33584: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:39:40 -0400 (0:00:00.022) 0:00:03.729 ****** 22286 1726882780.33655: entering _queue_task() for managed_node3/include_tasks 22286 1726882780.33841: worker is 1 (out of 1 available) 22286 1726882780.33855: exiting _queue_task() for managed_node3/include_tasks 22286 1726882780.33869: done queuing things up, now waiting for results queue to drain 22286 1726882780.33870: waiting for pending results... 22286 1726882780.34011: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 22286 1726882780.34115: in run() - task 0affe814-3a2d-a75d-4836-0000000000d0 22286 1726882780.34119: variable 'ansible_search_path' from source: unknown 22286 1726882780.34124: variable 'ansible_search_path' from source: unknown 22286 1726882780.34161: calling self._execute() 22286 1726882780.34234: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.34238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.34249: variable 'omit' from source: magic vars 22286 1726882780.34840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882780.37099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882780.37207: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882780.37211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882780.37380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882780.37383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882780.37387: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882780.37410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882780.37461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882780.37536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882780.37567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882780.37736: variable '__network_is_ostree' from source: set_fact 22286 1726882780.37770: Evaluated conditional (not __network_is_ostree | d(false)): True 22286 1726882780.37783: _execute() done 22286 1726882780.37791: dumping result to json 22286 1726882780.37799: done dumping result, returning 22286 1726882780.37813: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affe814-3a2d-a75d-4836-0000000000d0] 22286 1726882780.37825: sending task result for task 0affe814-3a2d-a75d-4836-0000000000d0 22286 1726882780.37966: no more pending results, returning what we have 22286 1726882780.37971: in VariableManager get_vars() 22286 1726882780.38011: Calling all_inventory to load vars for managed_node3 22286 1726882780.38015: Calling groups_inventory to load vars for managed_node3 22286 1726882780.38019: Calling all_plugins_inventory to load vars 
for managed_node3 22286 1726882780.38031: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.38038: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.38044: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.38319: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000d0 22286 1726882780.38323: WORKER PROCESS EXITING 22286 1726882780.38341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.38528: done with get_vars() 22286 1726882780.38539: variable 'ansible_search_path' from source: unknown 22286 1726882780.38541: variable 'ansible_search_path' from source: unknown 22286 1726882780.38569: we have included files to process 22286 1726882780.38570: generating all_blocks data 22286 1726882780.38571: done generating all_blocks data 22286 1726882780.38578: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22286 1726882780.38579: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22286 1726882780.38582: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22286 1726882780.39152: done processing included file 22286 1726882780.39154: iterating over new_blocks loaded from include file 22286 1726882780.39155: in VariableManager get_vars() 22286 1726882780.39164: done with get_vars() 22286 1726882780.39166: filtering new block on tags 22286 1726882780.39183: done filtering new block on tags 22286 1726882780.39185: in VariableManager get_vars() 22286 1726882780.39193: done with get_vars() 22286 1726882780.39194: filtering new block on tags 22286 1726882780.39202: done filtering new block on tags 22286 1726882780.39204: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 22286 1726882780.39208: extending task lists for all hosts with included blocks 22286 1726882780.39290: done extending task lists 22286 1726882780.39291: done processing included files 22286 1726882780.39292: results queue empty 22286 1726882780.39292: checking for any_errors_fatal 22286 1726882780.39295: done checking for any_errors_fatal 22286 1726882780.39295: checking for max_fail_percentage 22286 1726882780.39296: done checking for max_fail_percentage 22286 1726882780.39297: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.39297: done checking to see if all hosts have failed 22286 1726882780.39298: getting the remaining hosts for this loop 22286 1726882780.39299: done getting the remaining hosts for this loop 22286 1726882780.39301: getting the next task for host managed_node3 22286 1726882780.39304: done getting next task for host managed_node3 22286 1726882780.39306: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 22286 1726882780.39308: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.39310: getting variables 22286 1726882780.39311: in VariableManager get_vars() 22286 1726882780.39317: Calling all_inventory to load vars for managed_node3 22286 1726882780.39318: Calling groups_inventory to load vars for managed_node3 22286 1726882780.39320: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.39324: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.39329: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.39332: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.39471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.39642: done with get_vars() 22286 1726882780.39649: done getting variables 22286 1726882780.39706: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 22286 1726882780.39865: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:39:40 -0400 (0:00:00.062) 0:00:03.792 ****** 22286 1726882780.39905: entering _queue_task() for managed_node3/command 22286 1726882780.39907: Creating lock for command 22286 1726882780.40098: worker is 1 (out of 1 available) 22286 1726882780.40111: exiting _queue_task() for managed_node3/command 22286 1726882780.40123: done queuing things up, now waiting for results queue to drain 22286 1726882780.40124: waiting for pending results... 
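The TASK banner above renders as "Create EPEL 39" because the task name embeds `{{ ansible_distribution_major_version }}`, which templates to 39 on this node before the banner is printed. Combined with the include guard evaluated earlier (`not __network_is_ostree | d(false)`), a hedged sketch of the shapes involved (the actual command in enable_epel.yml is not visible in this excerpt, so its body is a placeholder):

```yaml
# Hypothetical sketch of the include and the templated task traced here;
# the real command run by enable_epel.yml is not shown in this log.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

# Inside enable_epel.yml:
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "..."  # placeholder: real command not shown in the trace
  when: ansible_distribution in ['RedHat', 'CentOS']
```

The `when` condition matches the `"false_condition": "ansible_distribution in ['RedHat', 'CentOS']"` recorded in the skip result that follows, which is why the templated task is skipped on this Fedora host.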
22286 1726882780.40450: running TaskExecutor() for managed_node3/TASK: Create EPEL 39 22286 1726882780.40455: in run() - task 0affe814-3a2d-a75d-4836-0000000000ea 22286 1726882780.40458: variable 'ansible_search_path' from source: unknown 22286 1726882780.40460: variable 'ansible_search_path' from source: unknown 22286 1726882780.40505: calling self._execute() 22286 1726882780.40595: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.40602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.40615: variable 'omit' from source: magic vars 22286 1726882780.41085: variable 'ansible_distribution' from source: facts 22286 1726882780.41116: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22286 1726882780.41124: when evaluation is False, skipping this task 22286 1726882780.41132: _execute() done 22286 1726882780.41144: dumping result to json 22286 1726882780.41153: done dumping result, returning 22286 1726882780.41164: done running TaskExecutor() for managed_node3/TASK: Create EPEL 39 [0affe814-3a2d-a75d-4836-0000000000ea] 22286 1726882780.41175: sending task result for task 0affe814-3a2d-a75d-4836-0000000000ea 22286 1726882780.41324: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000ea skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22286 1726882780.41377: no more pending results, returning what we have 22286 1726882780.41381: results queue empty 22286 1726882780.41382: checking for any_errors_fatal 22286 1726882780.41383: done checking for any_errors_fatal 22286 1726882780.41384: checking for max_fail_percentage 22286 1726882780.41386: done checking for max_fail_percentage 22286 1726882780.41387: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.41388: done checking to see if all hosts have failed 
22286 1726882780.41388: getting the remaining hosts for this loop 22286 1726882780.41390: done getting the remaining hosts for this loop 22286 1726882780.41393: getting the next task for host managed_node3 22286 1726882780.41399: done getting next task for host managed_node3 22286 1726882780.41402: ^ task is: TASK: Install yum-utils package 22286 1726882780.41406: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.41409: getting variables 22286 1726882780.41410: in VariableManager get_vars() 22286 1726882780.41466: Calling all_inventory to load vars for managed_node3 22286 1726882780.41469: Calling groups_inventory to load vars for managed_node3 22286 1726882780.41471: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.41480: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.41482: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.41486: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.41663: WORKER PROCESS EXITING 22286 1726882780.41675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.41879: done with get_vars() 22286 1726882780.41887: done getting variables 22286 1726882780.41961: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:39:40 -0400 (0:00:00.020) 0:00:03.813 ****** 22286 1726882780.41987: entering _queue_task() for managed_node3/package 22286 1726882780.41989: Creating lock for package 22286 1726882780.42171: worker is 1 (out of 1 available) 22286 1726882780.42185: exiting _queue_task() for managed_node3/package 22286 1726882780.42198: done queuing things up, now waiting for results queue to drain 22286 1726882780.42199: waiting for pending results... 
22286 1726882780.42341: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 22286 1726882780.42432: in run() - task 0affe814-3a2d-a75d-4836-0000000000eb 22286 1726882780.42447: variable 'ansible_search_path' from source: unknown 22286 1726882780.42450: variable 'ansible_search_path' from source: unknown 22286 1726882780.42476: calling self._execute() 22286 1726882780.42541: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.42545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.42558: variable 'omit' from source: magic vars 22286 1726882780.42857: variable 'ansible_distribution' from source: facts 22286 1726882780.42870: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22286 1726882780.42874: when evaluation is False, skipping this task 22286 1726882780.42879: _execute() done 22286 1726882780.42881: dumping result to json 22286 1726882780.42885: done dumping result, returning 22286 1726882780.42895: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affe814-3a2d-a75d-4836-0000000000eb] 22286 1726882780.42899: sending task result for task 0affe814-3a2d-a75d-4836-0000000000eb skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22286 1726882780.43039: no more pending results, returning what we have 22286 1726882780.43042: results queue empty 22286 1726882780.43043: checking for any_errors_fatal 22286 1726882780.43049: done checking for any_errors_fatal 22286 1726882780.43050: checking for max_fail_percentage 22286 1726882780.43052: done checking for max_fail_percentage 22286 1726882780.43053: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.43054: done checking to see if all hosts have failed 22286 1726882780.43055: getting the remaining hosts for this loop 22286 
1726882780.43056: done getting the remaining hosts for this loop 22286 1726882780.43060: getting the next task for host managed_node3 22286 1726882780.43066: done getting next task for host managed_node3 22286 1726882780.43068: ^ task is: TASK: Enable EPEL 7 22286 1726882780.43072: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.43075: getting variables 22286 1726882780.43076: in VariableManager get_vars() 22286 1726882780.43103: Calling all_inventory to load vars for managed_node3 22286 1726882780.43106: Calling groups_inventory to load vars for managed_node3 22286 1726882780.43109: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.43117: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.43120: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.43123: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.43268: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000eb 22286 1726882780.43272: WORKER PROCESS EXITING 22286 1726882780.43285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.43465: done with get_vars() 22286 1726882780.43472: done getting variables 22286 1726882780.43514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:39:40 -0400 (0:00:00.015) 0:00:03.828 ****** 22286 1726882780.43538: entering _queue_task() for managed_node3/command 22286 1726882780.43716: worker is 1 (out of 1 available) 22286 1726882780.43730: exiting _queue_task() for managed_node3/command 22286 1726882780.43744: done queuing things up, now waiting for results queue to drain 22286 1726882780.43746: waiting for pending results... 
22286 1726882780.43892: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 22286 1726882780.43967: in run() - task 0affe814-3a2d-a75d-4836-0000000000ec 22286 1726882780.43978: variable 'ansible_search_path' from source: unknown 22286 1726882780.43982: variable 'ansible_search_path' from source: unknown 22286 1726882780.44015: calling self._execute() 22286 1726882780.44074: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.44083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.44101: variable 'omit' from source: magic vars 22286 1726882780.44400: variable 'ansible_distribution' from source: facts 22286 1726882780.44411: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22286 1726882780.44415: when evaluation is False, skipping this task 22286 1726882780.44419: _execute() done 22286 1726882780.44422: dumping result to json 22286 1726882780.44432: done dumping result, returning 22286 1726882780.44437: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affe814-3a2d-a75d-4836-0000000000ec] 22286 1726882780.44444: sending task result for task 0affe814-3a2d-a75d-4836-0000000000ec 22286 1726882780.44532: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000ec 22286 1726882780.44536: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22286 1726882780.44588: no more pending results, returning what we have 22286 1726882780.44591: results queue empty 22286 1726882780.44592: checking for any_errors_fatal 22286 1726882780.44598: done checking for any_errors_fatal 22286 1726882780.44599: checking for max_fail_percentage 22286 1726882780.44600: done checking for max_fail_percentage 22286 1726882780.44601: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.44602: 
done checking to see if all hosts have failed 22286 1726882780.44603: getting the remaining hosts for this loop 22286 1726882780.44605: done getting the remaining hosts for this loop 22286 1726882780.44608: getting the next task for host managed_node3 22286 1726882780.44614: done getting next task for host managed_node3 22286 1726882780.44616: ^ task is: TASK: Enable EPEL 8 22286 1726882780.44620: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.44623: getting variables 22286 1726882780.44625: in VariableManager get_vars() 22286 1726882780.44658: Calling all_inventory to load vars for managed_node3 22286 1726882780.44660: Calling groups_inventory to load vars for managed_node3 22286 1726882780.44662: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.44669: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.44672: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.44674: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.44852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.45031: done with get_vars() 22286 1726882780.45041: done getting variables 22286 1726882780.45090: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:39:40 -0400 (0:00:00.015) 0:00:03.844 ****** 22286 1726882780.45112: entering _queue_task() for managed_node3/command 22286 1726882780.45295: worker is 1 (out of 1 available) 22286 1726882780.45306: exiting _queue_task() for managed_node3/command 22286 1726882780.45320: done queuing things up, now waiting for results queue to drain 22286 1726882780.45322: waiting for pending results... 
22286 1726882780.45478: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 22286 1726882780.45560: in run() - task 0affe814-3a2d-a75d-4836-0000000000ed 22286 1726882780.45640: variable 'ansible_search_path' from source: unknown 22286 1726882780.45644: variable 'ansible_search_path' from source: unknown 22286 1726882780.45647: calling self._execute() 22286 1726882780.45672: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.45687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.45698: variable 'omit' from source: magic vars 22286 1726882780.46006: variable 'ansible_distribution' from source: facts 22286 1726882780.46024: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22286 1726882780.46028: when evaluation is False, skipping this task 22286 1726882780.46031: _execute() done 22286 1726882780.46036: dumping result to json 22286 1726882780.46038: done dumping result, returning 22286 1726882780.46047: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affe814-3a2d-a75d-4836-0000000000ed] 22286 1726882780.46052: sending task result for task 0affe814-3a2d-a75d-4836-0000000000ed 22286 1726882780.46144: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000ed 22286 1726882780.46147: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22286 1726882780.46197: no more pending results, returning what we have 22286 1726882780.46201: results queue empty 22286 1726882780.46202: checking for any_errors_fatal 22286 1726882780.46207: done checking for any_errors_fatal 22286 1726882780.46208: checking for max_fail_percentage 22286 1726882780.46210: done checking for max_fail_percentage 22286 1726882780.46211: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.46212: 
done checking to see if all hosts have failed 22286 1726882780.46213: getting the remaining hosts for this loop 22286 1726882780.46214: done getting the remaining hosts for this loop 22286 1726882780.46217: getting the next task for host managed_node3 22286 1726882780.46226: done getting next task for host managed_node3 22286 1726882780.46229: ^ task is: TASK: Enable EPEL 6 22286 1726882780.46233: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.46238: getting variables 22286 1726882780.46239: in VariableManager get_vars() 22286 1726882780.46267: Calling all_inventory to load vars for managed_node3 22286 1726882780.46270: Calling groups_inventory to load vars for managed_node3 22286 1726882780.46274: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.46283: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.46286: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.46289: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.46438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.46616: done with get_vars() 22286 1726882780.46625: done getting variables 22286 1726882780.46669: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:39:40 -0400 (0:00:00.015) 0:00:03.860 ****** 22286 1726882780.46691: entering _queue_task() for managed_node3/copy 22286 1726882780.46890: worker is 1 (out of 1 available) 22286 1726882780.46904: exiting _queue_task() for managed_node3/copy 22286 1726882780.46916: done queuing things up, now waiting for results queue to drain 22286 1726882780.46918: waiting for pending results... 
22286 1726882780.47067: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 22286 1726882780.47143: in run() - task 0affe814-3a2d-a75d-4836-0000000000ef 22286 1726882780.47160: variable 'ansible_search_path' from source: unknown 22286 1726882780.47163: variable 'ansible_search_path' from source: unknown 22286 1726882780.47195: calling self._execute() 22286 1726882780.47257: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.47261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.47276: variable 'omit' from source: magic vars 22286 1726882780.47827: variable 'ansible_distribution' from source: facts 22286 1726882780.47838: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22286 1726882780.47841: when evaluation is False, skipping this task 22286 1726882780.47844: _execute() done 22286 1726882780.47849: dumping result to json 22286 1726882780.47854: done dumping result, returning 22286 1726882780.47861: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affe814-3a2d-a75d-4836-0000000000ef] 22286 1726882780.47866: sending task result for task 0affe814-3a2d-a75d-4836-0000000000ef 22286 1726882780.47964: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000ef 22286 1726882780.47967: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22286 1726882780.48015: no more pending results, returning what we have 22286 1726882780.48018: results queue empty 22286 1726882780.48019: checking for any_errors_fatal 22286 1726882780.48023: done checking for any_errors_fatal 22286 1726882780.48024: checking for max_fail_percentage 22286 1726882780.48026: done checking for max_fail_percentage 22286 1726882780.48027: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.48028: 
done checking to see if all hosts have failed 22286 1726882780.48029: getting the remaining hosts for this loop 22286 1726882780.48031: done getting the remaining hosts for this loop 22286 1726882780.48036: getting the next task for host managed_node3 22286 1726882780.48044: done getting next task for host managed_node3 22286 1726882780.48047: ^ task is: TASK: Set network provider to 'nm' 22286 1726882780.48049: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882780.48052: getting variables 22286 1726882780.48053: in VariableManager get_vars() 22286 1726882780.48080: Calling all_inventory to load vars for managed_node3 22286 1726882780.48083: Calling groups_inventory to load vars for managed_node3 22286 1726882780.48087: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.48097: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.48101: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.48103: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.48413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.48586: done with get_vars() 22286 1726882780.48593: done getting variables 22286 1726882780.48638: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Friday 20 September 2024 21:39:40 -0400 (0:00:00.019) 0:00:03.880 ****** 22286 1726882780.48660: entering _queue_task() for managed_node3/set_fact 22286 1726882780.48846: worker is 1 (out of 1 available) 22286 1726882780.48859: exiting _queue_task() for managed_node3/set_fact 22286 1726882780.48871: done queuing things up, now waiting for results queue to drain 22286 1726882780.48872: waiting for pending results... 22286 1726882780.49031: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 22286 1726882780.49099: in run() - task 0affe814-3a2d-a75d-4836-000000000007 22286 1726882780.49117: variable 'ansible_search_path' from source: unknown 22286 1726882780.49145: calling self._execute() 22286 1726882780.49209: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.49215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.49230: variable 'omit' from source: magic vars 22286 1726882780.49321: variable 'omit' from source: magic vars 22286 1726882780.49353: variable 'omit' from source: magic vars 22286 1726882780.49383: variable 'omit' from source: magic vars 22286 1726882780.49419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882780.49459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882780.49478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882780.49495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882780.49506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882780.49531: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882780.49536: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.49541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.49629: Set connection var ansible_shell_executable to /bin/sh 22286 1726882780.49639: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882780.49642: Set connection var ansible_connection to ssh 22286 1726882780.49645: Set connection var ansible_shell_type to sh 22286 1726882780.49653: Set connection var ansible_timeout to 10 22286 1726882780.49662: Set connection var ansible_pipelining to False 22286 1726882780.49687: variable 'ansible_shell_executable' from source: unknown 22286 1726882780.49690: variable 'ansible_connection' from source: unknown 22286 1726882780.49693: variable 'ansible_module_compression' from source: unknown 22286 1726882780.49696: variable 'ansible_shell_type' from source: unknown 22286 1726882780.49701: variable 'ansible_shell_executable' from source: unknown 22286 1726882780.49703: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.49709: variable 'ansible_pipelining' from source: unknown 22286 1726882780.49712: variable 'ansible_timeout' from source: unknown 22286 1726882780.49717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.49838: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882780.49848: variable 'omit' from source: magic vars 22286 1726882780.49853: starting attempt loop 22286 1726882780.49856: running the handler 22286 1726882780.49867: handler run complete 22286 1726882780.49894: attempt loop 
complete, returning result 22286 1726882780.49897: _execute() done 22286 1726882780.49900: dumping result to json 22286 1726882780.49902: done dumping result, returning 22286 1726882780.49904: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affe814-3a2d-a75d-4836-000000000007] 22286 1726882780.49907: sending task result for task 0affe814-3a2d-a75d-4836-000000000007 22286 1726882780.49992: done sending task result for task 0affe814-3a2d-a75d-4836-000000000007 22286 1726882780.49996: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 22286 1726882780.50066: no more pending results, returning what we have 22286 1726882780.50069: results queue empty 22286 1726882780.50070: checking for any_errors_fatal 22286 1726882780.50077: done checking for any_errors_fatal 22286 1726882780.50078: checking for max_fail_percentage 22286 1726882780.50080: done checking for max_fail_percentage 22286 1726882780.50081: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.50082: done checking to see if all hosts have failed 22286 1726882780.50083: getting the remaining hosts for this loop 22286 1726882780.50085: done getting the remaining hosts for this loop 22286 1726882780.50089: getting the next task for host managed_node3 22286 1726882780.50095: done getting next task for host managed_node3 22286 1726882780.50097: ^ task is: TASK: meta (flush_handlers) 22286 1726882780.50099: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.50103: getting variables 22286 1726882780.50104: in VariableManager get_vars() 22286 1726882780.50130: Calling all_inventory to load vars for managed_node3 22286 1726882780.50133: Calling groups_inventory to load vars for managed_node3 22286 1726882780.50138: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.50152: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.50155: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.50158: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.50307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.50502: done with get_vars() 22286 1726882780.50510: done getting variables 22286 1726882780.50562: in VariableManager get_vars() 22286 1726882780.50569: Calling all_inventory to load vars for managed_node3 22286 1726882780.50570: Calling groups_inventory to load vars for managed_node3 22286 1726882780.50572: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.50578: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.50580: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.50583: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.50709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.50879: done with get_vars() 22286 1726882780.50890: done queuing things up, now waiting for results queue to drain 22286 1726882780.50891: results queue empty 22286 1726882780.50892: checking for any_errors_fatal 22286 1726882780.50894: done checking for any_errors_fatal 22286 1726882780.50894: checking for max_fail_percentage 22286 1726882780.50895: done checking for max_fail_percentage 22286 1726882780.50896: checking to see if all hosts have failed and the running result is not 
ok 22286 1726882780.50896: done checking to see if all hosts have failed 22286 1726882780.50897: getting the remaining hosts for this loop 22286 1726882780.50898: done getting the remaining hosts for this loop 22286 1726882780.50899: getting the next task for host managed_node3 22286 1726882780.50902: done getting next task for host managed_node3 22286 1726882780.50904: ^ task is: TASK: meta (flush_handlers) 22286 1726882780.50905: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882780.50912: getting variables 22286 1726882780.50913: in VariableManager get_vars() 22286 1726882780.50921: Calling all_inventory to load vars for managed_node3 22286 1726882780.50923: Calling groups_inventory to load vars for managed_node3 22286 1726882780.50924: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.50928: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.50930: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.50932: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.51056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.51244: done with get_vars() 22286 1726882780.51252: done getting variables 22286 1726882780.51289: in VariableManager get_vars() 22286 1726882780.51295: Calling all_inventory to load vars for managed_node3 22286 1726882780.51297: Calling groups_inventory to load vars for managed_node3 22286 1726882780.51299: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.51302: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.51304: Calling groups_plugins_inventory to load vars for 
managed_node3 22286 1726882780.51306: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.51426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.51598: done with get_vars() 22286 1726882780.51608: done queuing things up, now waiting for results queue to drain 22286 1726882780.51609: results queue empty 22286 1726882780.51610: checking for any_errors_fatal 22286 1726882780.51611: done checking for any_errors_fatal 22286 1726882780.51611: checking for max_fail_percentage 22286 1726882780.51612: done checking for max_fail_percentage 22286 1726882780.51613: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.51613: done checking to see if all hosts have failed 22286 1726882780.51614: getting the remaining hosts for this loop 22286 1726882780.51615: done getting the remaining hosts for this loop 22286 1726882780.51616: getting the next task for host managed_node3 22286 1726882780.51618: done getting next task for host managed_node3 22286 1726882780.51619: ^ task is: None 22286 1726882780.51620: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.51621: done queuing things up, now waiting for results queue to drain 22286 1726882780.51622: results queue empty 22286 1726882780.51622: checking for any_errors_fatal 22286 1726882780.51623: done checking for any_errors_fatal 22286 1726882780.51623: checking for max_fail_percentage 22286 1726882780.51624: done checking for max_fail_percentage 22286 1726882780.51624: checking to see if all hosts have failed and the running result is not ok 22286 1726882780.51625: done checking to see if all hosts have failed 22286 1726882780.51626: getting the next task for host managed_node3 22286 1726882780.51628: done getting next task for host managed_node3 22286 1726882780.51629: ^ task is: None 22286 1726882780.51630: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.51669: in VariableManager get_vars() 22286 1726882780.51694: done with get_vars() 22286 1726882780.51699: in VariableManager get_vars() 22286 1726882780.51710: done with get_vars() 22286 1726882780.51714: variable 'omit' from source: magic vars 22286 1726882780.51740: in VariableManager get_vars() 22286 1726882780.51751: done with get_vars() 22286 1726882780.51767: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 22286 1726882780.52085: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22286 1726882780.52109: getting the remaining hosts for this loop 22286 1726882780.52110: done getting the remaining hosts for this loop 22286 1726882780.52113: getting the next task for host managed_node3 22286 1726882780.52115: done getting next task for host managed_node3 22286 1726882780.52117: ^ task is: TASK: Gathering Facts 22286 1726882780.52118: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882780.52120: getting variables 22286 1726882780.52121: in VariableManager get_vars() 22286 1726882780.52132: Calling all_inventory to load vars for managed_node3 22286 1726882780.52136: Calling groups_inventory to load vars for managed_node3 22286 1726882780.52138: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882780.52142: Calling all_plugins_play to load vars for managed_node3 22286 1726882780.52152: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882780.52155: Calling groups_plugins_play to load vars for managed_node3 22286 1726882780.52309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882780.52481: done with get_vars() 22286 1726882780.52489: done getting variables 22286 1726882780.52519: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Friday 20 September 2024 21:39:40 -0400 (0:00:00.038) 0:00:03.918 ****** 22286 1726882780.52538: entering _queue_task() for managed_node3/gather_facts 22286 1726882780.52738: worker is 1 (out of 1 available) 22286 1726882780.52749: exiting _queue_task() for managed_node3/gather_facts 22286 1726882780.52761: done queuing things up, now waiting for results queue to drain 22286 1726882780.52763: waiting for pending results... 
22286 1726882780.52919: running TaskExecutor() for managed_node3/TASK: Gathering Facts 22286 1726882780.52984: in run() - task 0affe814-3a2d-a75d-4836-000000000115 22286 1726882780.53000: variable 'ansible_search_path' from source: unknown 22286 1726882780.53031: calling self._execute() 22286 1726882780.53119: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.53124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.53126: variable 'omit' from source: magic vars 22286 1726882780.53434: variable 'ansible_distribution_major_version' from source: facts 22286 1726882780.53444: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882780.53454: variable 'omit' from source: magic vars 22286 1726882780.53476: variable 'omit' from source: magic vars 22286 1726882780.53506: variable 'omit' from source: magic vars 22286 1726882780.53539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882780.53576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882780.53595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882780.53611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882780.53622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882780.53652: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882780.53655: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.53658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.53748: Set connection var ansible_shell_executable to /bin/sh 22286 1726882780.53757: Set 
connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882780.53760: Set connection var ansible_connection to ssh 22286 1726882780.53762: Set connection var ansible_shell_type to sh 22286 1726882780.53771: Set connection var ansible_timeout to 10 22286 1726882780.53783: Set connection var ansible_pipelining to False 22286 1726882780.53805: variable 'ansible_shell_executable' from source: unknown 22286 1726882780.53809: variable 'ansible_connection' from source: unknown 22286 1726882780.53811: variable 'ansible_module_compression' from source: unknown 22286 1726882780.53814: variable 'ansible_shell_type' from source: unknown 22286 1726882780.53818: variable 'ansible_shell_executable' from source: unknown 22286 1726882780.53821: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882780.53827: variable 'ansible_pipelining' from source: unknown 22286 1726882780.53829: variable 'ansible_timeout' from source: unknown 22286 1726882780.53835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882780.53986: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882780.53997: variable 'omit' from source: magic vars 22286 1726882780.54010: starting attempt loop 22286 1726882780.54013: running the handler 22286 1726882780.54026: variable 'ansible_facts' from source: unknown 22286 1726882780.54044: _low_level_execute_command(): starting 22286 1726882780.54052: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882780.54605: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 
1726882780.54609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.54612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882780.54617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.54674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882780.54686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882780.54689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882780.54817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882780.57336: stdout chunk (state=3): >>>/root <<< 22286 1726882780.57500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882780.57551: stderr chunk (state=3): >>><<< 22286 1726882780.57554: stdout chunk (state=3): >>><<< 22286 1726882780.57577: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882780.57586: _low_level_execute_command(): starting 22286 1726882780.57592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904 `" && echo ansible-tmp-1726882780.5757446-22439-38284180306904="` echo /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904 `" ) && sleep 0' 22286 1726882780.58050: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.58053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.58057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.58065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.58117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882780.58121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882780.58250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882780.61165: stdout chunk (state=3): >>>ansible-tmp-1726882780.5757446-22439-38284180306904=/root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904 <<< 22286 1726882780.61356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882780.61404: stderr chunk (state=3): >>><<< 22286 1726882780.61409: stdout chunk (state=3): >>><<< 22286 1726882780.61424: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882780.5757446-22439-38284180306904=/root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882780.61450: variable 'ansible_module_compression' from source: unknown 22286 1726882780.61491: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22286 1726882780.61546: variable 'ansible_facts' from source: unknown 22286 1726882780.61667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py 22286 1726882780.61786: Sending initial data 22286 1726882780.61789: Sent initial data (153 bytes) 22286 1726882780.62239: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882780.62242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882780.62245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882780.62247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882780.62250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.62310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882780.62313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882780.62425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882780.64805: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 22286 1726882780.64808: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882780.64920: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882780.65043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp1tgdl_sl /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py <<< 22286 1726882780.65048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py" <<< 22286 1726882780.65160: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp1tgdl_sl" to remote "/root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py" <<< 22286 1726882780.65166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py" <<< 22286 1726882780.67275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882780.67339: stderr chunk (state=3): >>><<< 22286 1726882780.67342: stdout chunk (state=3): >>><<< 22286 1726882780.67364: done transferring module to remote 22286 1726882780.67377: _low_level_execute_command(): starting 22286 1726882780.67384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/ /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py && sleep 0' 22286 1726882780.67848: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.67851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 
1726882780.67854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.67856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.67906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882780.67916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882780.68031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882780.70799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882780.70844: stderr chunk (state=3): >>><<< 22286 1726882780.70849: stdout chunk (state=3): >>><<< 22286 1726882780.70870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882780.70874: _low_level_execute_command(): starting 22286 1726882780.70879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/AnsiballZ_setup.py && sleep 0' 22286 1726882780.71324: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.71328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.71330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882780.71333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882780.71383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882780.71387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 22286 1726882780.71515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882781.64618: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.728515625, "5m": 0.48388671875, "15m": 0.2822265625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "41", "epoch": "1726882781", "epoch_int": "1726882781", "date": "2024-09-20", "time": "21:39:41", "iso8601_micro": "2024-09-21T01:39:41.223129Z", "iso8601": "2024-09-21T01:39:41Z", "iso8601_basic": "20240920T213941223129", "iso8601_basic_short": "20240920T213941", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora<<< 22286 1726882781.64657: stdout chunk (state=3): >>>", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, 
"ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2867, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 850, "free": 2867}, "nocache": {"free": 3454, "used": 263}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", 
"uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 925, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251211792384, "block_size": 4096, "block_total": 64483404, "block_available": 61331004, "block_used": 3152400, "inode_total": 16384000, "inode_available": 16303844, "inode_used": 80156, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", 
"broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22286 1726882781.67717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882781.67759: stderr chunk (state=3): >>>Shared connection to 
10.31.41.238 closed. <<< 22286 1726882781.67763: stdout chunk (state=3): >>><<< 22286 1726882781.67765: stderr chunk (state=3): >>><<< 22286 1726882781.67806: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.728515625, "5m": 0.48388671875, "15m": 0.2822265625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-41-238", "ansible_nodename": "ip-10-31-41-238.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec243b3f63949f99a85dd461938b27f6", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANYijgij1fEhTOf5yay/qzv1+ckF/sTeAcrQU7mSl4JlHlSjJFRS9ZOEcTyZhIM24bmrUmUAXGByisr1fJhHdM1w6H4TV9d8eAGz5dvqRt3OMFXU98TIudAuK2zln4nrfCSz2a6X/3opBJuckX9rZaO0ickijdGATG1zU5j6yse5AAAAFQCB9h5S0fKeFdZzGlNOZp7suEtGLQAAAIBqL8YusJlS5M+t8hqB5XoiVX2JRwxeO45o1F+YDEL8s88gEWv3QxNNB5xqhdMrEbA13n8FJWfZZdvcU7PONunHJRbKJZFHcCdK5TI9eGObNVaZTYNSFhZ2BieAeUf4m7eiHxQI/o71WHee8ehKt8oSXovKXzKzzh6V8adityCM0QAAAIBN41A6QjMEnqm1991CkEko30YVBdWgdcunoDD7NJJoONNTR054WsZIbydxWyYVFa3fC4HcmSYJjXHxuSZuCmzFZYOzQSedVWWiET/kLEvIDxOZEQ44DCsa3zg29Ty97IbNNwLSIOFXoUbWllCQV9qge0q5dQ/J1wTQdymso3DyLQ==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9Kn//qIm4a6eEMMZR4qpdJnSJr3FLIr3UgGTmVZPamYNdQ29vbdZONWFxDxoVqR69dvgMi7B4CVdaUu6QyBPlMI1hnnTMD7yGuFvf0wDLvk2p5tQf1MwOc8WdJCPqkvcCYTOD4gBf/qOT2LG96u3e6y9NpDSs43WwFzV2YMOpEVnbg+17SPjuOyE07jTJi4gLXbcjXxt9rz8nQMlsQPFysakPATk6pjVZnnTWDcFUSfc1sUdO6IWl/O4jlB/QtP/FkO38YQbSYx1fiZNsk+JP6ZeZ4F0trwlxRemd9P6eEqtA9jVdvSCvJNHgZoMob64uw1c2P8BFaAByky5crE35ggw6pKcQTHTAHhrPBTx12gpwlL4rB+OoysKYhxI8VeW+TYiNWBxF+EUpmcj/QMfOOgNbIEeK+YfNZ606vwhkyjORVqaN3MswYozhtwmAoyxDKaTAYWXo4+d+GqZ7pURKpwdZrI8M7e8Nvd+dwpW3OtrfAqXvFwIrBrivFfWnDE0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBsLvf+TKoJqIfm5H0y7RP9w7PN99SnDATkd0bkTPwuIbQqBA6MAihYQaVCQtnKQCWC09GNZyMQSeayjLONajkY=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIME0e8Dw9KPaZbGCYYNAh3+j3dHxYuGhpELosAmEvhOR", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "41", "epoch": "1726882781", "epoch_int": "1726882781", "date": "2024-09-20", "time": "21:39:41", "iso8601_micro": "2024-09-21T01:39:41.223129Z", "iso8601": "2024-09-21T01:39:41Z", "iso8601_basic": "20240920T213941223129", "iso8601_basic_short": "20240920T213941", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 37172 10.31.41.238 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 37172 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "", 
"ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2867, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 850, "free": 2867}, "nocache": {"free": 3454, "used": 263}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_uuid": "ec243b3f-6394-9f99-a85d-d461938b27f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", 
"scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 925, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251211792384, "block_size": 4096, "block_total": 64483404, "block_available": 61331004, "block_used": 3152400, "inode_total": 16384000, "inode_available": 16303844, "inode_used": 80156, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": 
"255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::a0b7:fdc4:48e8:7158", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.41.238", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:39:03:af:ed:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.41.238"], "ansible_all_ipv6_addresses": ["fe80::a0b7:fdc4:48e8:7158"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.41.238", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::a0b7:fdc4:48e8:7158"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882781.68473: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882781.68476: _low_level_execute_command(): starting 22286 1726882781.68479: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882780.5757446-22439-38284180306904/ > /dev/null 2>&1 && 
sleep 0' 22286 1726882781.69104: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882781.69128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882781.69148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882781.69249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882781.69285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882781.69302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882781.69324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882781.69488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882781.72452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882781.72533: stderr chunk (state=3): >>><<< 22286 1726882781.72555: stdout chunk (state=3): >>><<< 22286 1726882781.72581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882781.72640: handler run complete 22286 1726882781.72849: variable 'ansible_facts' from source: unknown 22286 1726882781.73025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.73613: variable 'ansible_facts' from source: unknown 22286 1726882781.74080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.74346: attempt loop complete, returning result 22286 1726882781.74416: _execute() done 22286 1726882781.74426: dumping result to json 22286 1726882781.74475: done dumping result, returning 22286 1726882781.74510: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affe814-3a2d-a75d-4836-000000000115] 22286 1726882781.74523: sending task result for task 0affe814-3a2d-a75d-4836-000000000115 ok: [managed_node3] 22286 1726882781.76020: done sending task result for task 0affe814-3a2d-a75d-4836-000000000115 22286 1726882781.76023: WORKER 
PROCESS EXITING 22286 1726882781.76461: no more pending results, returning what we have 22286 1726882781.76464: results queue empty 22286 1726882781.76465: checking for any_errors_fatal 22286 1726882781.76467: done checking for any_errors_fatal 22286 1726882781.76468: checking for max_fail_percentage 22286 1726882781.76470: done checking for max_fail_percentage 22286 1726882781.76471: checking to see if all hosts have failed and the running result is not ok 22286 1726882781.76472: done checking to see if all hosts have failed 22286 1726882781.76473: getting the remaining hosts for this loop 22286 1726882781.76475: done getting the remaining hosts for this loop 22286 1726882781.76481: getting the next task for host managed_node3 22286 1726882781.76488: done getting next task for host managed_node3 22286 1726882781.76490: ^ task is: TASK: meta (flush_handlers) 22286 1726882781.76492: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882781.76497: getting variables 22286 1726882781.76498: in VariableManager get_vars() 22286 1726882781.76533: Calling all_inventory to load vars for managed_node3 22286 1726882781.76627: Calling groups_inventory to load vars for managed_node3 22286 1726882781.76631: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.76646: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.76671: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882781.76707: Calling groups_plugins_play to load vars for managed_node3 22286 1726882781.77191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.78099: done with get_vars() 22286 1726882781.78156: done getting variables 22286 1726882781.78409: in VariableManager get_vars() 22286 1726882781.78426: Calling all_inventory to load vars for managed_node3 22286 1726882781.78493: Calling groups_inventory to load vars for managed_node3 22286 1726882781.78497: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.78502: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.78506: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882781.78510: Calling groups_plugins_play to load vars for managed_node3 22286 1726882781.79549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.79839: done with get_vars() 22286 1726882781.79855: done queuing things up, now waiting for results queue to drain 22286 1726882781.79858: results queue empty 22286 1726882781.79859: checking for any_errors_fatal 22286 1726882781.79863: done checking for any_errors_fatal 22286 1726882781.79864: checking for max_fail_percentage 22286 1726882781.79866: done checking for max_fail_percentage 22286 1726882781.79870: checking to see if all hosts have failed and the running result is not 
ok 22286 1726882781.79871: done checking to see if all hosts have failed 22286 1726882781.79872: getting the remaining hosts for this loop 22286 1726882781.79873: done getting the remaining hosts for this loop 22286 1726882781.79876: getting the next task for host managed_node3 22286 1726882781.79881: done getting next task for host managed_node3 22286 1726882781.79884: ^ task is: TASK: Include the task 'show_interfaces.yml' 22286 1726882781.79886: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882781.79889: getting variables 22286 1726882781.79890: in VariableManager get_vars() 22286 1726882781.79905: Calling all_inventory to load vars for managed_node3 22286 1726882781.79908: Calling groups_inventory to load vars for managed_node3 22286 1726882781.79910: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.79915: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.79918: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882781.79921: Calling groups_plugins_play to load vars for managed_node3 22286 1726882781.80119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.80417: done with get_vars() 22286 1726882781.80428: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Friday 20 September 2024 21:39:41 -0400 (0:00:01.279) 0:00:05.198 ****** 22286 1726882781.80510: entering _queue_task() for managed_node3/include_tasks 22286 1726882781.80810: worker is 1 (out of 1 available) 22286 1726882781.80824: 
exiting _queue_task() for managed_node3/include_tasks 22286 1726882781.81039: done queuing things up, now waiting for results queue to drain 22286 1726882781.81042: waiting for pending results... 22286 1726882781.81104: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 22286 1726882781.81225: in run() - task 0affe814-3a2d-a75d-4836-00000000000b 22286 1726882781.81255: variable 'ansible_search_path' from source: unknown 22286 1726882781.81307: calling self._execute() 22286 1726882781.81406: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882781.81419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882781.81485: variable 'omit' from source: magic vars 22286 1726882781.82079: variable 'ansible_distribution_major_version' from source: facts 22286 1726882781.82100: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882781.82111: _execute() done 22286 1726882781.82119: dumping result to json 22286 1726882781.82128: done dumping result, returning 22286 1726882781.82195: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-a75d-4836-00000000000b] 22286 1726882781.82206: sending task result for task 0affe814-3a2d-a75d-4836-00000000000b 22286 1726882781.82540: done sending task result for task 0affe814-3a2d-a75d-4836-00000000000b 22286 1726882781.82543: WORKER PROCESS EXITING 22286 1726882781.82575: no more pending results, returning what we have 22286 1726882781.82580: in VariableManager get_vars() 22286 1726882781.82631: Calling all_inventory to load vars for managed_node3 22286 1726882781.82639: Calling groups_inventory to load vars for managed_node3 22286 1726882781.82642: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.82659: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.82662: Calling groups_plugins_inventory to load vars for 
managed_node3 22286 1726882781.82666: Calling groups_plugins_play to load vars for managed_node3 22286 1726882781.83475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.83946: done with get_vars() 22286 1726882781.83954: variable 'ansible_search_path' from source: unknown 22286 1726882781.83968: we have included files to process 22286 1726882781.83969: generating all_blocks data 22286 1726882781.83971: done generating all_blocks data 22286 1726882781.83972: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882781.83973: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882781.83976: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882781.84172: in VariableManager get_vars() 22286 1726882781.84198: done with get_vars() 22286 1726882781.84329: done processing included file 22286 1726882781.84331: iterating over new_blocks loaded from include file 22286 1726882781.84335: in VariableManager get_vars() 22286 1726882781.84356: done with get_vars() 22286 1726882781.84359: filtering new block on tags 22286 1726882781.84380: done filtering new block on tags 22286 1726882781.84383: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 22286 1726882781.84388: extending task lists for all hosts with included blocks 22286 1726882781.84478: done extending task lists 22286 1726882781.84480: done processing included files 22286 1726882781.84481: results queue empty 22286 1726882781.84482: checking for any_errors_fatal 22286 1726882781.84485: done checking for 
any_errors_fatal 22286 1726882781.84486: checking for max_fail_percentage 22286 1726882781.84487: done checking for max_fail_percentage 22286 1726882781.84488: checking to see if all hosts have failed and the running result is not ok 22286 1726882781.84489: done checking to see if all hosts have failed 22286 1726882781.84490: getting the remaining hosts for this loop 22286 1726882781.84492: done getting the remaining hosts for this loop 22286 1726882781.84495: getting the next task for host managed_node3 22286 1726882781.84499: done getting next task for host managed_node3 22286 1726882781.84501: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22286 1726882781.84505: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882781.84507: getting variables 22286 1726882781.84509: in VariableManager get_vars() 22286 1726882781.84524: Calling all_inventory to load vars for managed_node3 22286 1726882781.84527: Calling groups_inventory to load vars for managed_node3 22286 1726882781.84530: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.84537: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.84540: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882781.84545: Calling groups_plugins_play to load vars for managed_node3 22286 1726882781.84784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.85089: done with get_vars() 22286 1726882781.85101: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:41 -0400 (0:00:00.046) 0:00:05.245 ****** 22286 1726882781.85186: entering _queue_task() for managed_node3/include_tasks 22286 1726882781.85462: worker is 1 (out of 1 available) 22286 1726882781.85473: exiting _queue_task() for managed_node3/include_tasks 22286 1726882781.85488: done queuing things up, now waiting for results queue to drain 22286 1726882781.85489: waiting for pending results... 
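For orientation, the two include tasks logged above likely correspond to task files along these lines. This is a reconstruction from the task paths and evaluated conditional in this log, not the actual collection sources:

```yaml
# Sketch of tests_ipv6.yml:9, per the task path logged above
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml

# Sketch of tasks/show_interfaces.yml:3, per the task path logged above;
# the log shows the conditional (ansible_distribution_major_version != '6')
# evaluating to True before each include is queued
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```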
22286 1726882781.85770: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 22286 1726882781.85915: in run() - task 0affe814-3a2d-a75d-4836-00000000012b 22286 1726882781.85940: variable 'ansible_search_path' from source: unknown 22286 1726882781.85955: variable 'ansible_search_path' from source: unknown 22286 1726882781.86020: calling self._execute() 22286 1726882781.86124: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882781.86160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882781.86339: variable 'omit' from source: magic vars 22286 1726882781.86684: variable 'ansible_distribution_major_version' from source: facts 22286 1726882781.86720: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882781.86732: _execute() done 22286 1726882781.86743: dumping result to json 22286 1726882781.86752: done dumping result, returning 22286 1726882781.86762: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-a75d-4836-00000000012b] 22286 1726882781.86772: sending task result for task 0affe814-3a2d-a75d-4836-00000000012b 22286 1726882781.87141: done sending task result for task 0affe814-3a2d-a75d-4836-00000000012b 22286 1726882781.87145: WORKER PROCESS EXITING 22286 1726882781.87173: no more pending results, returning what we have 22286 1726882781.87181: in VariableManager get_vars() 22286 1726882781.87229: Calling all_inventory to load vars for managed_node3 22286 1726882781.87232: Calling groups_inventory to load vars for managed_node3 22286 1726882781.87238: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.87254: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.87258: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882781.87262: Calling groups_plugins_play to load vars for managed_node3 22286 
1726882781.88051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.88525: done with get_vars() 22286 1726882781.88667: variable 'ansible_search_path' from source: unknown 22286 1726882781.88670: variable 'ansible_search_path' from source: unknown 22286 1726882781.88715: we have included files to process 22286 1726882781.88717: generating all_blocks data 22286 1726882781.88719: done generating all_blocks data 22286 1726882781.88720: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882781.88721: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882781.88724: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882781.89142: done processing included file 22286 1726882781.89144: iterating over new_blocks loaded from include file 22286 1726882781.89146: in VariableManager get_vars() 22286 1726882781.89168: done with get_vars() 22286 1726882781.89170: filtering new block on tags 22286 1726882781.89193: done filtering new block on tags 22286 1726882781.89196: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 22286 1726882781.89201: extending task lists for all hosts with included blocks 22286 1726882781.89336: done extending task lists 22286 1726882781.89337: done processing included files 22286 1726882781.89338: results queue empty 22286 1726882781.89339: checking for any_errors_fatal 22286 1726882781.89342: done checking for any_errors_fatal 22286 1726882781.89344: checking for max_fail_percentage 22286 1726882781.89345: done 
checking for max_fail_percentage 22286 1726882781.89346: checking to see if all hosts have failed and the running result is not ok 22286 1726882781.89347: done checking to see if all hosts have failed 22286 1726882781.89348: getting the remaining hosts for this loop 22286 1726882781.89349: done getting the remaining hosts for this loop 22286 1726882781.89352: getting the next task for host managed_node3 22286 1726882781.89357: done getting next task for host managed_node3 22286 1726882781.89360: ^ task is: TASK: Gather current interface info 22286 1726882781.89363: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882781.89366: getting variables 22286 1726882781.89367: in VariableManager get_vars() 22286 1726882781.89384: Calling all_inventory to load vars for managed_node3 22286 1726882781.89387: Calling groups_inventory to load vars for managed_node3 22286 1726882781.89390: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882781.89395: Calling all_plugins_play to load vars for managed_node3 22286 1726882781.89398: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882781.89402: Calling groups_plugins_play to load vars for managed_node3 22286 1726882781.89654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882781.89973: done with get_vars() 22286 1726882781.89987: done getting variables 22286 1726882781.90032: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:41 -0400 (0:00:00.048) 0:00:05.294 ****** 22286 1726882781.90065: entering _queue_task() for managed_node3/command 22286 1726882781.90321: worker is 1 (out of 1 available) 22286 1726882781.90332: exiting _queue_task() for managed_node3/command 22286 1726882781.90547: done queuing things up, now waiting for results queue to drain 22286 1726882781.90549: waiting for pending results... 
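Before running the module, Ansible first creates a remote temporary directory over SSH; the exact `/bin/sh -c` command appears verbatim later in this log. A minimal local reproduction of that shell idiom (with a stand-in base path instead of `/root/.ansible/tmp`) looks like:

```shell
#!/bin/sh
# BASE stands in for /root/.ansible/tmp; the subshell + umask 77 pattern
# is copied from the logged _low_level_execute_command() invocation.
BASE="${TMPDIR:-/tmp}/ansible-sketch-$$"
DEMO_DIR=$( ( umask 77 && mkdir -p "$BASE" \
    && mkdir "$BASE/ansible-tmp-demo" \
    && echo "$BASE/ansible-tmp-demo" ) )
# umask 77 means both directories are created mode 0700 (owner-only),
# which is why module payloads staged there are not world-readable.
echo "$DEMO_DIR"
```

The `umask 77` inside a subshell scopes the restrictive mask to the `mkdir` calls, and the trailing `echo` is how Ansible learns the generated path back on the controller.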
22286 1726882781.90609: running TaskExecutor() for managed_node3/TASK: Gather current interface info 22286 1726882781.90744: in run() - task 0affe814-3a2d-a75d-4836-00000000013a 22286 1726882781.90764: variable 'ansible_search_path' from source: unknown 22286 1726882781.90781: variable 'ansible_search_path' from source: unknown 22286 1726882781.91088: calling self._execute() 22286 1726882781.91453: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882781.91456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882781.91460: variable 'omit' from source: magic vars 22286 1726882781.92074: variable 'ansible_distribution_major_version' from source: facts 22286 1726882781.92127: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882781.92439: variable 'omit' from source: magic vars 22286 1726882781.92445: variable 'omit' from source: magic vars 22286 1726882781.92615: variable 'omit' from source: magic vars 22286 1726882781.92672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882781.92762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882781.92858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882781.92890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882781.92957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882781.93000: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882781.93055: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882781.93066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 
1726882781.93451: Set connection var ansible_shell_executable to /bin/sh 22286 1726882781.93454: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882781.93456: Set connection var ansible_connection to ssh 22286 1726882781.93458: Set connection var ansible_shell_type to sh 22286 1726882781.93460: Set connection var ansible_timeout to 10 22286 1726882781.93462: Set connection var ansible_pipelining to False 22286 1726882781.93464: variable 'ansible_shell_executable' from source: unknown 22286 1726882781.93465: variable 'ansible_connection' from source: unknown 22286 1726882781.93468: variable 'ansible_module_compression' from source: unknown 22286 1726882781.93469: variable 'ansible_shell_type' from source: unknown 22286 1726882781.93471: variable 'ansible_shell_executable' from source: unknown 22286 1726882781.93473: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882781.93475: variable 'ansible_pipelining' from source: unknown 22286 1726882781.93479: variable 'ansible_timeout' from source: unknown 22286 1726882781.93569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882781.93839: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882781.94106: variable 'omit' from source: magic vars 22286 1726882781.94109: starting attempt loop 22286 1726882781.94112: running the handler 22286 1726882781.94115: _low_level_execute_command(): starting 22286 1726882781.94117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882781.95463: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882781.95872: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882781.95878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882781.95957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882781.96110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882781.97967: stdout chunk (state=3): >>>/root <<< 22286 1726882781.98079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882781.98253: stderr chunk (state=3): >>><<< 22286 1726882781.98263: stdout chunk (state=3): >>><<< 22286 1726882781.98299: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882781.98320: _low_level_execute_command(): starting 22286 1726882781.98363: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748 `" && echo ansible-tmp-1726882781.9830692-22487-158138142444748="` echo /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748 `" ) && sleep 0' 22286 1726882781.99729: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882781.99781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882781.99797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882781.99951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882782.02071: stdout chunk (state=3): >>>ansible-tmp-1726882781.9830692-22487-158138142444748=/root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748 <<< 22286 1726882782.02245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882782.02355: stderr chunk (state=3): >>><<< 22286 1726882782.02366: stdout chunk (state=3): >>><<< 22286 1726882782.02394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882781.9830692-22487-158138142444748=/root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882782.02438: variable 'ansible_module_compression' from source: unknown 22286 1726882782.02623: ANSIBALLZ: Using generic lock for ansible.legacy.command 22286 1726882782.02632: ANSIBALLZ: Acquiring lock 22286 1726882782.02644: ANSIBALLZ: Lock acquired: 140212085117232 22286 1726882782.02653: ANSIBALLZ: Creating module 22286 1726882782.34362: ANSIBALLZ: Writing module into payload 22286 1726882782.34620: ANSIBALLZ: Writing module 22286 1726882782.35042: ANSIBALLZ: Renaming module 22286 1726882782.35045: ANSIBALLZ: Done creating module 22286 1726882782.35047: variable 'ansible_facts' from source: unknown 22286 1726882782.35261: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py 22286 1726882782.35611: Sending initial data 22286 1726882782.35615: Sent initial data (156 bytes) 22286 1726882782.37012: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882782.37028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882782.37124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882782.37202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882782.37351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882782.37491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882782.39342: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882782.39478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882782.39641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpo6of2jtl /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py <<< 22286 1726882782.39652: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py" <<< 22286 1726882782.39756: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpo6of2jtl" to remote "/root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py" <<< 22286 1726882782.41974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882782.42074: stderr chunk (state=3): >>><<< 22286 1726882782.42088: stdout chunk (state=3): >>><<< 22286 1726882782.42118: done transferring module to remote 22286 1726882782.42260: _low_level_execute_command(): starting 22286 1726882782.42264: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/ /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py && sleep 0' 22286 1726882782.43293: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882782.43297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882782.43300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882782.43302: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882782.43304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882782.43658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882782.43807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882782.45862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882782.46142: stderr chunk (state=3): >>><<< 22286 1726882782.46148: stdout chunk (state=3): >>><<< 22286 1726882782.46151: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882782.46153: _low_level_execute_command(): starting 22286 1726882782.46156: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/AnsiballZ_command.py && sleep 0' 22286 1726882782.47189: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882782.47350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882782.47455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882782.47643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882782.47764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882782.65411: 
stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:42.648582", "end": "2024-09-20 21:39:42.652131", "delta": "0:00:00.003549", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882782.67266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882782.67270: stdout chunk (state=3): >>><<< 22286 1726882782.67272: stderr chunk (state=3): >>><<< 22286 1726882782.67293: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:42.648582", "end": "2024-09-20 21:39:42.652131", "delta": "0:00:00.003549", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882782.67558: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882782.67567: _low_level_execute_command(): starting 22286 1726882782.67640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882781.9830692-22487-158138142444748/ > /dev/null 2>&1 && sleep 0' 22286 1726882782.68739: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882782.68743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882782.68745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882782.68748: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882782.68750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882782.68794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882782.68808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882782.68823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882782.68974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882782.71140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882782.71144: stdout chunk (state=3): >>><<< 22286 1726882782.71146: stderr chunk (state=3): >>><<< 22286 1726882782.71149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882782.71152: handler run complete 22286 1726882782.71154: Evaluated conditional (False): False 22286 1726882782.71156: attempt loop complete, returning result 22286 1726882782.71158: _execute() done 22286 1726882782.71160: dumping result to json 22286 1726882782.71162: done dumping result, returning 22286 1726882782.71165: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affe814-3a2d-a75d-4836-00000000013a] 22286 1726882782.71170: sending task result for task 0affe814-3a2d-a75d-4836-00000000013a 22286 1726882782.71318: done sending task result for task 0affe814-3a2d-a75d-4836-00000000013a 22286 1726882782.71321: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003549", "end": "2024-09-20 21:39:42.652131", "rc": 0, "start": "2024-09-20 21:39:42.648582" } STDOUT: bonding_masters eth0 lo 22286 1726882782.71577: no more pending results, returning what we have 22286 1726882782.71581: results queue empty 22286 1726882782.71582: checking for any_errors_fatal 22286 1726882782.71584: done checking for any_errors_fatal 22286 1726882782.71585: checking for max_fail_percentage 22286 1726882782.71587: done checking for max_fail_percentage 22286 1726882782.71588: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.71590: done 
checking to see if all hosts have failed 22286 1726882782.71591: getting the remaining hosts for this loop 22286 1726882782.71593: done getting the remaining hosts for this loop 22286 1726882782.71598: getting the next task for host managed_node3 22286 1726882782.71605: done getting next task for host managed_node3 22286 1726882782.71609: ^ task is: TASK: Set current_interfaces 22286 1726882782.71614: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.71618: getting variables 22286 1726882782.71620: in VariableManager get_vars() 22286 1726882782.72033: Calling all_inventory to load vars for managed_node3 22286 1726882782.72040: Calling groups_inventory to load vars for managed_node3 22286 1726882782.72044: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.72059: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.72062: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.72067: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.72498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.72907: done with get_vars() 22286 1726882782.72921: done getting variables 22286 1726882782.73000: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:42 -0400 (0:00:00.829) 0:00:06.123 ****** 22286 1726882782.73033: entering _queue_task() for managed_node3/set_fact 22286 1726882782.73388: worker is 1 (out of 1 available) 22286 1726882782.73399: exiting _queue_task() for managed_node3/set_fact 22286 1726882782.73590: done queuing things up, now waiting for results queue to drain 22286 1726882782.73592: waiting for pending results... 
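The task result above shows the `ansible.legacy.command` module running `ls -1` with `chdir=/sys/class/net` and returning the interface names (`bonding_masters`, `eth0`, `lo`) on stdout. A minimal stand-alone sketch of that gathering step — the helper names are illustrative, not part of Ansible — looks like:

```python
import subprocess

def gather_interfaces(sysfs_dir="/sys/class/net"):
    # Mirrors the task's module_args: run `ls -1` with chdir=/sys/class/net.
    # (Requires a Linux host with /sys mounted.)
    result = subprocess.run(
        ["ls", "-1"], cwd=sysfs_dir, capture_output=True, text=True, check=True
    )
    return [line for line in result.stdout.splitlines() if line]

def parse_module_stdout(stdout):
    # The same split applied to the module result's "stdout" field,
    # e.g. "bonding_masters\neth0\nlo" becomes a list of interface names.
    return [line for line in stdout.splitlines() if line]
```

With the stdout captured in the log, `parse_module_stdout("bonding_masters\neth0\nlo")` yields `["bonding_masters", "eth0", "lo"]`, the list the next task stores as a fact.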
22286 1726882782.73853: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 22286 1726882782.74040: in run() - task 0affe814-3a2d-a75d-4836-00000000013b 22286 1726882782.74043: variable 'ansible_search_path' from source: unknown 22286 1726882782.74047: variable 'ansible_search_path' from source: unknown 22286 1726882782.74050: calling self._execute() 22286 1726882782.74082: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.74098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.74112: variable 'omit' from source: magic vars 22286 1726882782.74557: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.74571: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.74583: variable 'omit' from source: magic vars 22286 1726882782.74632: variable 'omit' from source: magic vars 22286 1726882782.74770: variable '_current_interfaces' from source: set_fact 22286 1726882782.74841: variable 'omit' from source: magic vars 22286 1726882782.74893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882782.74938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882782.74967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882782.74988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882782.75003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882782.75239: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882782.75243: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.75245: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.75248: Set connection var ansible_shell_executable to /bin/sh 22286 1726882782.75250: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882782.75252: Set connection var ansible_connection to ssh 22286 1726882782.75255: Set connection var ansible_shell_type to sh 22286 1726882782.75257: Set connection var ansible_timeout to 10 22286 1726882782.75259: Set connection var ansible_pipelining to False 22286 1726882782.75268: variable 'ansible_shell_executable' from source: unknown 22286 1726882782.75271: variable 'ansible_connection' from source: unknown 22286 1726882782.75278: variable 'ansible_module_compression' from source: unknown 22286 1726882782.75281: variable 'ansible_shell_type' from source: unknown 22286 1726882782.75283: variable 'ansible_shell_executable' from source: unknown 22286 1726882782.75299: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.75304: variable 'ansible_pipelining' from source: unknown 22286 1726882782.75308: variable 'ansible_timeout' from source: unknown 22286 1726882782.75314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.75490: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882782.75503: variable 'omit' from source: magic vars 22286 1726882782.75521: starting attempt loop 22286 1726882782.75525: running the handler 22286 1726882782.75542: handler run complete 22286 1726882782.75554: attempt loop complete, returning result 22286 1726882782.75557: _execute() done 22286 1726882782.75560: dumping result to json 22286 1726882782.75565: done dumping result, returning 22286 
1726882782.75574: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affe814-3a2d-a75d-4836-00000000013b] 22286 1726882782.75579: sending task result for task 0affe814-3a2d-a75d-4836-00000000013b 22286 1726882782.75677: done sending task result for task 0affe814-3a2d-a75d-4836-00000000013b 22286 1726882782.75682: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22286 1726882782.75766: no more pending results, returning what we have 22286 1726882782.75770: results queue empty 22286 1726882782.75771: checking for any_errors_fatal 22286 1726882782.75783: done checking for any_errors_fatal 22286 1726882782.75785: checking for max_fail_percentage 22286 1726882782.75787: done checking for max_fail_percentage 22286 1726882782.75788: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.75789: done checking to see if all hosts have failed 22286 1726882782.75790: getting the remaining hosts for this loop 22286 1726882782.75792: done getting the remaining hosts for this loop 22286 1726882782.75797: getting the next task for host managed_node3 22286 1726882782.75808: done getting next task for host managed_node3 22286 1726882782.75811: ^ task is: TASK: Show current_interfaces 22286 1726882782.75815: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.75822: getting variables 22286 1726882782.75823: in VariableManager get_vars() 22286 1726882782.75877: Calling all_inventory to load vars for managed_node3 22286 1726882782.75880: Calling groups_inventory to load vars for managed_node3 22286 1726882782.75884: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.75898: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.75902: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.75907: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.76401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.76759: done with get_vars() 22286 1726882782.76773: done getting variables 22286 1726882782.76921: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:42 -0400 (0:00:00.039) 0:00:06.162 ****** 22286 1726882782.76958: entering _queue_task() for managed_node3/debug 22286 1726882782.76961: Creating lock for debug 22286 1726882782.77402: worker is 1 (out of 1 available) 22286 1726882782.77412: exiting _queue_task() for managed_node3/debug 22286 1726882782.77424: done queuing things up, now waiting for results queue to drain 22286 1726882782.77426: waiting for pending results... 
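In the trace above, `set_fact` runs entirely on the controller (no module is shipped over SSH): it merges the task's key/value pairs into the host's fact store and reports `changed: false`. A rough illustration of that merge — the function and argument names are hypothetical, not Ansible's internals:

```python
def set_fact(host_facts, new_facts):
    # set_fact merges new key/value pairs into the host's fact cache
    # and always reports changed=False, as seen in the task result above.
    merged = dict(host_facts)  # leave the caller's dict untouched
    merged.update(new_facts)
    return {"ansible_facts": new_facts, "changed": False}, merged

result, facts = set_fact(
    {}, {"current_interfaces": ["bonding_masters", "eth0", "lo"]}
)
```

The returned `result` mirrors the `ok: [managed_node3]` payload in the log, and `facts["current_interfaces"]` is what later tasks reference.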
22286 1726882782.77670: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 22286 1726882782.77826: in run() - task 0affe814-3a2d-a75d-4836-00000000012c 22286 1726882782.77874: variable 'ansible_search_path' from source: unknown 22286 1726882782.77878: variable 'ansible_search_path' from source: unknown 22286 1726882782.77935: calling self._execute() 22286 1726882782.78096: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.78100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.78196: variable 'omit' from source: magic vars 22286 1726882782.78626: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.78684: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.78721: variable 'omit' from source: magic vars 22286 1726882782.78778: variable 'omit' from source: magic vars 22286 1726882782.78953: variable 'current_interfaces' from source: set_fact 22286 1726882782.78996: variable 'omit' from source: magic vars 22286 1726882782.79071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882782.79117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882782.79199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882782.79248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882782.79332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882782.79375: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882782.79402: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.79449: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.79867: Set connection var ansible_shell_executable to /bin/sh 22286 1726882782.79890: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882782.79940: Set connection var ansible_connection to ssh 22286 1726882782.79943: Set connection var ansible_shell_type to sh 22286 1726882782.79946: Set connection var ansible_timeout to 10 22286 1726882782.79948: Set connection var ansible_pipelining to False 22286 1726882782.79993: variable 'ansible_shell_executable' from source: unknown 22286 1726882782.80003: variable 'ansible_connection' from source: unknown 22286 1726882782.80010: variable 'ansible_module_compression' from source: unknown 22286 1726882782.80016: variable 'ansible_shell_type' from source: unknown 22286 1726882782.80039: variable 'ansible_shell_executable' from source: unknown 22286 1726882782.80043: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.80046: variable 'ansible_pipelining' from source: unknown 22286 1726882782.80048: variable 'ansible_timeout' from source: unknown 22286 1726882782.80057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.80298: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882782.80301: variable 'omit' from source: magic vars 22286 1726882782.80304: starting attempt loop 22286 1726882782.80308: running the handler 22286 1726882782.80406: handler run complete 22286 1726882782.80410: attempt loop complete, returning result 22286 1726882782.80412: _execute() done 22286 1726882782.80422: dumping result to json 22286 1726882782.80432: done dumping result, returning 22286 1726882782.80449: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affe814-3a2d-a75d-4836-00000000012c] 22286 1726882782.80513: sending task result for task 0affe814-3a2d-a75d-4836-00000000012c 22286 1726882782.80708: done sending task result for task 0affe814-3a2d-a75d-4836-00000000012c 22286 1726882782.80712: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22286 1726882782.80772: no more pending results, returning what we have 22286 1726882782.80776: results queue empty 22286 1726882782.80777: checking for any_errors_fatal 22286 1726882782.80781: done checking for any_errors_fatal 22286 1726882782.80783: checking for max_fail_percentage 22286 1726882782.80785: done checking for max_fail_percentage 22286 1726882782.80786: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.80787: done checking to see if all hosts have failed 22286 1726882782.80788: getting the remaining hosts for this loop 22286 1726882782.80790: done getting the remaining hosts for this loop 22286 1726882782.80795: getting the next task for host managed_node3 22286 1726882782.80804: done getting next task for host managed_node3 22286 1726882782.80808: ^ task is: TASK: Include the task 'manage_test_interface.yml' 22286 1726882782.80811: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.80816: getting variables 22286 1726882782.80817: in VariableManager get_vars() 22286 1726882782.81017: Calling all_inventory to load vars for managed_node3 22286 1726882782.81021: Calling groups_inventory to load vars for managed_node3 22286 1726882782.81024: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.81105: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.81110: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.81115: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.81522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.81882: done with get_vars() 22286 1726882782.81893: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Friday 20 September 2024 21:39:42 -0400 (0:00:00.050) 0:00:06.213 ****** 22286 1726882782.82011: entering _queue_task() for managed_node3/include_tasks 22286 1726882782.82279: worker is 1 (out of 1 available) 22286 1726882782.82443: exiting _queue_task() for managed_node3/include_tasks 22286 1726882782.82455: done queuing things up, now waiting for results queue to drain 22286 1726882782.82456: waiting for pending results... 
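Every task in this run logs `Evaluated conditional (ansible_distribution_major_version != '6'): True` before executing — the play's shared `when:` clause checked against gathered facts. The real evaluation goes through Jinja2; this toy stand-in (names are illustrative) captures only the comparison the log records:

```python
def evaluate_when(facts, fact_name, excluded_value):
    # Stand-in for the Jinja2 'when' evaluation seen in the log:
    # the task runs only if the fact differs from the excluded value.
    return facts.get(fact_name) != excluded_value

facts = {"ansible_distribution_major_version": "9"}
runs = evaluate_when(facts, "ansible_distribution_major_version", "6")
```

Note the comparison is against the string `'6'`, not the integer 6 — distribution facts are strings, a common source of surprise in `when:` expressions.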
22286 1726882782.82765: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 22286 1726882782.82770: in run() - task 0affe814-3a2d-a75d-4836-00000000000c 22286 1726882782.82774: variable 'ansible_search_path' from source: unknown 22286 1726882782.82777: calling self._execute() 22286 1726882782.82868: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.82883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.82909: variable 'omit' from source: magic vars 22286 1726882782.83387: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.83472: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.83475: _execute() done 22286 1726882782.83478: dumping result to json 22286 1726882782.83481: done dumping result, returning 22286 1726882782.83484: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0affe814-3a2d-a75d-4836-00000000000c] 22286 1726882782.83486: sending task result for task 0affe814-3a2d-a75d-4836-00000000000c 22286 1726882782.83730: no more pending results, returning what we have 22286 1726882782.83736: in VariableManager get_vars() 22286 1726882782.83781: Calling all_inventory to load vars for managed_node3 22286 1726882782.83785: Calling groups_inventory to load vars for managed_node3 22286 1726882782.83788: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.83801: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.83805: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.83809: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.84162: done sending task result for task 0affe814-3a2d-a75d-4836-00000000000c 22286 1726882782.84166: WORKER PROCESS EXITING 22286 1726882782.84193: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.84536: done with get_vars() 22286 1726882782.84545: variable 'ansible_search_path' from source: unknown 22286 1726882782.84558: we have included files to process 22286 1726882782.84559: generating all_blocks data 22286 1726882782.84561: done generating all_blocks data 22286 1726882782.84567: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22286 1726882782.84569: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22286 1726882782.84572: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22286 1726882782.85398: in VariableManager get_vars() 22286 1726882782.85425: done with get_vars() 22286 1726882782.85817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 22286 1726882782.86656: done processing included file 22286 1726882782.86659: iterating over new_blocks loaded from include file 22286 1726882782.86660: in VariableManager get_vars() 22286 1726882782.86690: done with get_vars() 22286 1726882782.86693: filtering new block on tags 22286 1726882782.86736: done filtering new block on tags 22286 1726882782.86739: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 22286 1726882782.86952: extending task lists for all hosts with included blocks 22286 1726882782.87174: done extending task lists 22286 1726882782.87176: done processing included files 22286 1726882782.87177: results queue empty 22286 1726882782.87178: checking for any_errors_fatal 22286 1726882782.87181: done checking for 
any_errors_fatal 22286 1726882782.87182: checking for max_fail_percentage 22286 1726882782.87183: done checking for max_fail_percentage 22286 1726882782.87184: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.87185: done checking to see if all hosts have failed 22286 1726882782.87186: getting the remaining hosts for this loop 22286 1726882782.87187: done getting the remaining hosts for this loop 22286 1726882782.87190: getting the next task for host managed_node3 22286 1726882782.87195: done getting next task for host managed_node3 22286 1726882782.87197: ^ task is: TASK: Ensure state in ["present", "absent"] 22286 1726882782.87200: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.87202: getting variables 22286 1726882782.87204: in VariableManager get_vars() 22286 1726882782.87223: Calling all_inventory to load vars for managed_node3 22286 1726882782.87226: Calling groups_inventory to load vars for managed_node3 22286 1726882782.87234: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.87248: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.87257: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.87262: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.87483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.87804: done with get_vars() 22286 1726882782.87815: done getting variables 22286 1726882782.87899: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:39:42 -0400 (0:00:00.059) 0:00:06.272 ****** 22286 1726882782.87939: entering _queue_task() for managed_node3/fail 22286 1726882782.87941: Creating lock for fail 22286 1726882782.88332: worker is 1 (out of 1 available) 22286 1726882782.88344: exiting _queue_task() for managed_node3/fail 22286 1726882782.88360: done queuing things up, now waiting for results queue to drain 22286 1726882782.88361: waiting for pending results... 
22286 1726882782.88542: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 22286 1726882782.88614: in run() - task 0affe814-3a2d-a75d-4836-000000000156 22286 1726882782.88627: variable 'ansible_search_path' from source: unknown 22286 1726882782.88631: variable 'ansible_search_path' from source: unknown 22286 1726882782.88666: calling self._execute() 22286 1726882782.88739: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.88746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.88758: variable 'omit' from source: magic vars 22286 1726882782.89070: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.89082: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.89197: variable 'state' from source: include params 22286 1726882782.89203: Evaluated conditional (state not in ["present", "absent"]): False 22286 1726882782.89206: when evaluation is False, skipping this task 22286 1726882782.89209: _execute() done 22286 1726882782.89217: dumping result to json 22286 1726882782.89220: done dumping result, returning 22286 1726882782.89223: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0affe814-3a2d-a75d-4836-000000000156] 22286 1726882782.89231: sending task result for task 0affe814-3a2d-a75d-4836-000000000156 22286 1726882782.89337: done sending task result for task 0affe814-3a2d-a75d-4836-000000000156 22286 1726882782.89340: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 22286 1726882782.89395: no more pending results, returning what we have 22286 1726882782.89399: results queue empty 22286 1726882782.89399: checking for any_errors_fatal 22286 1726882782.89401: done checking for any_errors_fatal 22286 1726882782.89402: 
checking for max_fail_percentage 22286 1726882782.89404: done checking for max_fail_percentage 22286 1726882782.89404: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.89405: done checking to see if all hosts have failed 22286 1726882782.89406: getting the remaining hosts for this loop 22286 1726882782.89408: done getting the remaining hosts for this loop 22286 1726882782.89412: getting the next task for host managed_node3 22286 1726882782.89417: done getting next task for host managed_node3 22286 1726882782.89420: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 22286 1726882782.89422: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.89426: getting variables 22286 1726882782.89435: in VariableManager get_vars() 22286 1726882782.89469: Calling all_inventory to load vars for managed_node3 22286 1726882782.89472: Calling groups_inventory to load vars for managed_node3 22286 1726882782.89473: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.89483: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.89485: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.89487: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.89664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.89844: done with get_vars() 22286 1726882782.89852: done getting variables 22286 1726882782.89898: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:39:42 -0400 (0:00:00.019) 0:00:06.292 ****** 22286 1726882782.89919: entering _queue_task() for managed_node3/fail 22286 1726882782.90106: worker is 1 (out of 1 available) 22286 1726882782.90119: exiting _queue_task() for managed_node3/fail 22286 1726882782.90299: done queuing things up, now waiting for results queue to drain 22286 1726882782.90301: waiting for pending results... 
22286 1726882782.90323: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 22286 1726882782.90471: in run() - task 0affe814-3a2d-a75d-4836-000000000157 22286 1726882782.90478: variable 'ansible_search_path' from source: unknown 22286 1726882782.90482: variable 'ansible_search_path' from source: unknown 22286 1726882782.90486: calling self._execute() 22286 1726882782.90689: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.90693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.90697: variable 'omit' from source: magic vars 22286 1726882782.91052: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.91065: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.91264: variable 'type' from source: play vars 22286 1726882782.91287: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 22286 1726882782.91291: when evaluation is False, skipping this task 22286 1726882782.91294: _execute() done 22286 1726882782.91296: dumping result to json 22286 1726882782.91299: done dumping result, returning 22286 1726882782.91302: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affe814-3a2d-a75d-4836-000000000157] 22286 1726882782.91304: sending task result for task 0affe814-3a2d-a75d-4836-000000000157 22286 1726882782.91407: done sending task result for task 0affe814-3a2d-a75d-4836-000000000157 22286 1726882782.91410: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 22286 1726882782.91503: no more pending results, returning what we have 22286 1726882782.91506: results queue empty 22286 1726882782.91507: checking for any_errors_fatal 22286 1726882782.91512: done checking for any_errors_fatal 22286 1726882782.91513: 
checking for max_fail_percentage 22286 1726882782.91514: done checking for max_fail_percentage 22286 1726882782.91515: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.91516: done checking to see if all hosts have failed 22286 1726882782.91517: getting the remaining hosts for this loop 22286 1726882782.91518: done getting the remaining hosts for this loop 22286 1726882782.91521: getting the next task for host managed_node3 22286 1726882782.91527: done getting next task for host managed_node3 22286 1726882782.91529: ^ task is: TASK: Include the task 'show_interfaces.yml' 22286 1726882782.91532: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.91537: getting variables 22286 1726882782.91538: in VariableManager get_vars() 22286 1726882782.91573: Calling all_inventory to load vars for managed_node3 22286 1726882782.91579: Calling groups_inventory to load vars for managed_node3 22286 1726882782.91583: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.91595: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.91602: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.91607: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.91766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.91946: done with get_vars() 22286 1726882782.91954: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:39:42 -0400 (0:00:00.021) 0:00:06.313 ****** 22286 1726882782.92043: entering _queue_task() for managed_node3/include_tasks 22286 1726882782.92284: worker is 1 (out of 1 available) 22286 1726882782.92295: exiting _queue_task() for managed_node3/include_tasks 22286 1726882782.92307: done queuing things up, now waiting for results queue to drain 22286 1726882782.92309: waiting for pending results... 
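The two skipped guard tasks logged above (task paths manage_test_interface.yml:3 and :8) both resolve to the `fail` action module gated by a `when` condition. A minimal sketch of what they likely look like, reconstructed only from the logged task names and `false_condition` strings — the actual file contents are not shown in this log, and the `msg` texts are assumptions:

```yaml
# Hedged reconstruction from the log above: the task names and the
# evaluated conditionals are taken from the logged false_condition
# strings; the msg values are placeholders, not the real file contents.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: state must be present or absent  # assumed message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: type must be dummy, tap or veth  # assumed message
  when: type not in ["dummy", "tap", "veth"]
```

When the `when` expression evaluates to False, the task executor logs "when evaluation is False, skipping this task" and emits the `skipping:` result with `"skip_reason": "Conditional result was False"`, exactly as seen in the two task results above.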
22286 1726882782.92587: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 22286 1726882782.92685: in run() - task 0affe814-3a2d-a75d-4836-000000000158 22286 1726882782.92688: variable 'ansible_search_path' from source: unknown 22286 1726882782.92692: variable 'ansible_search_path' from source: unknown 22286 1726882782.92793: calling self._execute() 22286 1726882782.92848: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.92861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.92881: variable 'omit' from source: magic vars 22286 1726882782.93356: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.93366: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.93372: _execute() done 22286 1726882782.93378: dumping result to json 22286 1726882782.93381: done dumping result, returning 22286 1726882782.93385: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-a75d-4836-000000000158] 22286 1726882782.93397: sending task result for task 0affe814-3a2d-a75d-4836-000000000158 22286 1726882782.93487: done sending task result for task 0affe814-3a2d-a75d-4836-000000000158 22286 1726882782.93490: WORKER PROCESS EXITING 22286 1726882782.93518: no more pending results, returning what we have 22286 1726882782.93522: in VariableManager get_vars() 22286 1726882782.93565: Calling all_inventory to load vars for managed_node3 22286 1726882782.93568: Calling groups_inventory to load vars for managed_node3 22286 1726882782.93571: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.93587: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.93590: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.93593: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.93872: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.94191: done with get_vars() 22286 1726882782.94200: variable 'ansible_search_path' from source: unknown 22286 1726882782.94202: variable 'ansible_search_path' from source: unknown 22286 1726882782.94244: we have included files to process 22286 1726882782.94245: generating all_blocks data 22286 1726882782.94248: done generating all_blocks data 22286 1726882782.94252: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882782.94253: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882782.94256: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882782.94378: in VariableManager get_vars() 22286 1726882782.94407: done with get_vars() 22286 1726882782.94543: done processing included file 22286 1726882782.94546: iterating over new_blocks loaded from include file 22286 1726882782.94547: in VariableManager get_vars() 22286 1726882782.94571: done with get_vars() 22286 1726882782.94573: filtering new block on tags 22286 1726882782.94598: done filtering new block on tags 22286 1726882782.94601: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 22286 1726882782.94607: extending task lists for all hosts with included blocks 22286 1726882782.95201: done extending task lists 22286 1726882782.95202: done processing included files 22286 1726882782.95203: results queue empty 22286 1726882782.95204: checking for any_errors_fatal 22286 1726882782.95207: done checking for any_errors_fatal 22286 1726882782.95208: checking for 
max_fail_percentage 22286 1726882782.95210: done checking for max_fail_percentage 22286 1726882782.95210: checking to see if all hosts have failed and the running result is not ok 22286 1726882782.95211: done checking to see if all hosts have failed 22286 1726882782.95212: getting the remaining hosts for this loop 22286 1726882782.95214: done getting the remaining hosts for this loop 22286 1726882782.95217: getting the next task for host managed_node3 22286 1726882782.95224: done getting next task for host managed_node3 22286 1726882782.95232: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22286 1726882782.95237: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882782.95261: getting variables 22286 1726882782.95264: in VariableManager get_vars() 22286 1726882782.95298: Calling all_inventory to load vars for managed_node3 22286 1726882782.95302: Calling groups_inventory to load vars for managed_node3 22286 1726882782.95308: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.95314: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.95318: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.95322: Calling groups_plugins_play to load vars for managed_node3 22286 1726882782.95577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.96089: done with get_vars() 22286 1726882782.96100: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:42 -0400 (0:00:00.041) 0:00:06.355 ****** 22286 1726882782.96187: entering _queue_task() for managed_node3/include_tasks 22286 1726882782.96414: worker is 1 (out of 1 available) 22286 1726882782.96425: exiting _queue_task() for managed_node3/include_tasks 22286 1726882782.96450: done queuing things up, now waiting for results queue to drain 22286 1726882782.96452: waiting for pending results... 
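The preceding chunks record a nested include chain: manage_test_interface.yml:13 includes show_interfaces.yml, whose own line 3 includes get_current_interfaces.yml. A hedged sketch of that chain — only the file paths and task names come from the log; the relative-path form is an assumption:

```yaml
# Hedged sketch of the include chain recorded in the log above.
# In manage_test_interface.yml (task path :13):
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml  # assumed relative path form

# In show_interfaces.yml (task path :3):
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml  # assumed relative path form
```

Each include produces the same sequence in the log: "we have included files to process", "generating all_blocks data", "filtering new block on tags", then "extending task lists for all hosts with included blocks".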
22286 1726882782.96852: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 22286 1726882782.96859: in run() - task 0affe814-3a2d-a75d-4836-00000000017f 22286 1726882782.96862: variable 'ansible_search_path' from source: unknown 22286 1726882782.96866: variable 'ansible_search_path' from source: unknown 22286 1726882782.96887: calling self._execute() 22286 1726882782.96997: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882782.97017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882782.97038: variable 'omit' from source: magic vars 22286 1726882782.97472: variable 'ansible_distribution_major_version' from source: facts 22286 1726882782.97494: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882782.97505: _execute() done 22286 1726882782.97513: dumping result to json 22286 1726882782.97521: done dumping result, returning 22286 1726882782.97740: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-a75d-4836-00000000017f] 22286 1726882782.97744: sending task result for task 0affe814-3a2d-a75d-4836-00000000017f 22286 1726882782.97813: done sending task result for task 0affe814-3a2d-a75d-4836-00000000017f 22286 1726882782.97817: WORKER PROCESS EXITING 22286 1726882782.97844: no more pending results, returning what we have 22286 1726882782.97848: in VariableManager get_vars() 22286 1726882782.97889: Calling all_inventory to load vars for managed_node3 22286 1726882782.97893: Calling groups_inventory to load vars for managed_node3 22286 1726882782.97896: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882782.97906: Calling all_plugins_play to load vars for managed_node3 22286 1726882782.97909: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882782.97913: Calling groups_plugins_play to load vars for managed_node3 22286 
1726882782.98390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882782.99192: done with get_vars() 22286 1726882782.99202: variable 'ansible_search_path' from source: unknown 22286 1726882782.99203: variable 'ansible_search_path' from source: unknown 22286 1726882782.99318: we have included files to process 22286 1726882782.99320: generating all_blocks data 22286 1726882782.99322: done generating all_blocks data 22286 1726882782.99323: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882782.99325: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882782.99327: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882783.00052: done processing included file 22286 1726882783.00054: iterating over new_blocks loaded from include file 22286 1726882783.00056: in VariableManager get_vars() 22286 1726882783.00078: done with get_vars() 22286 1726882783.00120: filtering new block on tags 22286 1726882783.00145: done filtering new block on tags 22286 1726882783.00148: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 22286 1726882783.00153: extending task lists for all hosts with included blocks 22286 1726882783.00622: done extending task lists 22286 1726882783.00624: done processing included files 22286 1726882783.00625: results queue empty 22286 1726882783.00626: checking for any_errors_fatal 22286 1726882783.00629: done checking for any_errors_fatal 22286 1726882783.00630: checking for max_fail_percentage 22286 1726882783.00632: done 
checking for max_fail_percentage 22286 1726882783.00633: checking to see if all hosts have failed and the running result is not ok 22286 1726882783.00637: done checking to see if all hosts have failed 22286 1726882783.00638: getting the remaining hosts for this loop 22286 1726882783.00639: done getting the remaining hosts for this loop 22286 1726882783.00643: getting the next task for host managed_node3 22286 1726882783.00648: done getting next task for host managed_node3 22286 1726882783.00650: ^ task is: TASK: Gather current interface info 22286 1726882783.00654: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882783.00657: getting variables 22286 1726882783.00658: in VariableManager get_vars() 22286 1726882783.00788: Calling all_inventory to load vars for managed_node3 22286 1726882783.00792: Calling groups_inventory to load vars for managed_node3 22286 1726882783.00794: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882783.00800: Calling all_plugins_play to load vars for managed_node3 22286 1726882783.00803: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882783.00806: Calling groups_plugins_play to load vars for managed_node3 22286 1726882783.01191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882783.01703: done with get_vars() 22286 1726882783.01714: done getting variables 22286 1726882783.01794: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:43 -0400 (0:00:00.056) 0:00:06.411 ****** 22286 1726882783.01846: entering _queue_task() for managed_node3/command 22286 1726882783.02587: worker is 1 (out of 1 available) 22286 1726882783.02600: exiting _queue_task() for managed_node3/command 22286 1726882783.02617: done queuing things up, now waiting for results queue to drain 22286 1726882783.02619: waiting for pending results... 
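The "Gather current interface info" task (get_current_interfaces.yml:3) resolves to the `command` action module and, per the chunks that follow, runs over the ssh connection plugin with an AnsiballZ-wrapped module. A hedged sketch — the actual command body is not visible in this excerpt, so the `ls /sys/class/net` line and the register name are assumptions for illustration only:

```yaml
# Hedged sketch; the real command text is not shown in this log excerpt.
- name: Gather current interface info
  command: ls /sys/class/net  # assumed command body (hypothetical)
  register: _current_interfaces  # hypothetical register name
```

The subsequent `_low_level_execute_command()` chunks show the usual command-module flow: an `echo ~` probe to resolve the remote home directory, creation of a per-task remote tmp directory (`ansible-tmp-<epoch>-<pid>-<random>`), transfer of `AnsiballZ_command.py`, then execution.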
22286 1726882783.03182: running TaskExecutor() for managed_node3/TASK: Gather current interface info 22286 1726882783.03426: in run() - task 0affe814-3a2d-a75d-4836-0000000001b6 22286 1726882783.03442: variable 'ansible_search_path' from source: unknown 22286 1726882783.03446: variable 'ansible_search_path' from source: unknown 22286 1726882783.03603: calling self._execute() 22286 1726882783.03943: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.03948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.03951: variable 'omit' from source: magic vars 22286 1726882783.04440: variable 'ansible_distribution_major_version' from source: facts 22286 1726882783.04444: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882783.04447: variable 'omit' from source: magic vars 22286 1726882783.04450: variable 'omit' from source: magic vars 22286 1726882783.04497: variable 'omit' from source: magic vars 22286 1726882783.04545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882783.04598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882783.04622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882783.04648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.04662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.04709: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882783.04712: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.04718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 
1726882783.04940: Set connection var ansible_shell_executable to /bin/sh 22286 1726882783.04944: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882783.04947: Set connection var ansible_connection to ssh 22286 1726882783.04949: Set connection var ansible_shell_type to sh 22286 1726882783.04952: Set connection var ansible_timeout to 10 22286 1726882783.04954: Set connection var ansible_pipelining to False 22286 1726882783.04957: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.04959: variable 'ansible_connection' from source: unknown 22286 1726882783.04962: variable 'ansible_module_compression' from source: unknown 22286 1726882783.04964: variable 'ansible_shell_type' from source: unknown 22286 1726882783.04966: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.04968: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.04970: variable 'ansible_pipelining' from source: unknown 22286 1726882783.04972: variable 'ansible_timeout' from source: unknown 22286 1726882783.04974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.05341: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882783.05346: variable 'omit' from source: magic vars 22286 1726882783.05348: starting attempt loop 22286 1726882783.05351: running the handler 22286 1726882783.05353: _low_level_execute_command(): starting 22286 1726882783.05356: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882783.06140: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882783.06241: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 22286 1726882783.06246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.06286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882783.06339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882783.06446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.06592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.08430: stdout chunk (state=3): >>>/root <<< 22286 1726882783.08592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.08683: stderr chunk (state=3): >>><<< 22286 1726882783.08699: stdout chunk (state=3): >>><<< 22286 1726882783.08725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882783.08751: _low_level_execute_command(): starting 22286 1726882783.08763: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098 `" && echo ansible-tmp-1726882783.0873652-22547-131755842735098="` echo /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098 `" ) && sleep 0' 22286 1726882783.09374: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882783.09393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882783.09448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.09525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882783.09550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882783.09567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.09728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.11861: stdout chunk (state=3): >>>ansible-tmp-1726882783.0873652-22547-131755842735098=/root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098 <<< 22286 1726882783.12042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.12057: stdout chunk (state=3): >>><<< 22286 1726882783.12068: stderr chunk (state=3): >>><<< 22286 1726882783.12093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882783.0873652-22547-131755842735098=/root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882783.12128: variable 'ansible_module_compression' from source: unknown 22286 1726882783.12196: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882783.12240: variable 'ansible_facts' from source: unknown 22286 1726882783.12349: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py 22286 1726882783.12558: Sending initial data 22286 1726882783.12571: Sent initial data (156 bytes) 22286 1726882783.13172: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882783.13232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.13314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882783.13340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882783.13363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.13503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.15211: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882783.15372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882783.15489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpj0rkdy3i /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py <<< 22286 1726882783.15493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py" <<< 22286 1726882783.15582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpj0rkdy3i" to remote "/root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py" <<< 22286 1726882783.17058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.17118: stderr chunk (state=3): >>><<< 22286 1726882783.17121: stdout chunk (state=3): >>><<< 22286 1726882783.17146: done transferring module to remote 22286 1726882783.17157: _low_level_execute_command(): starting 22286 1726882783.17162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/ /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py && sleep 0' 22286 1726882783.17616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882783.17619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882783.17622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 
1726882783.17625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882783.17627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.17686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882783.17690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.17809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.19723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.19777: stderr chunk (state=3): >>><<< 22286 1726882783.19780: stdout chunk (state=3): >>><<< 22286 1726882783.19791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882783.19840: _low_level_execute_command(): starting 22286 1726882783.19844: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/AnsiballZ_command.py && sleep 0' 22286 1726882783.20200: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882783.20240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882783.20243: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.20246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882783.20248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.20288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882783.20304: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.20422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.37904: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:43.373632", "end": "2024-09-20 21:39:43.377151", "delta": "0:00:00.003519", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882783.39463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882783.39526: stderr chunk (state=3): >>><<< 22286 1726882783.39530: stdout chunk (state=3): >>><<< 22286 1726882783.39550: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:43.373632", "end": "2024-09-20 21:39:43.377151", "delta": "0:00:00.003519", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882783.39590: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882783.39598: _low_level_execute_command(): starting 22286 1726882783.39604: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882783.0873652-22547-131755842735098/ > /dev/null 2>&1 && sleep 0' 22286 1726882783.40131: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
22286 1726882783.40137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.40139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882783.40142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.40199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882783.40206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.40320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.42554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.42558: stdout chunk (state=3): >>><<< 22286 1726882783.42561: stderr chunk (state=3): >>><<< 22286 1726882783.42563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882783.42565: handler run complete 22286 1726882783.42568: Evaluated conditional (False): False 22286 1726882783.42570: attempt loop complete, returning result 22286 1726882783.42572: _execute() done 22286 1726882783.42574: dumping result to json 22286 1726882783.42576: done dumping result, returning 22286 1726882783.42577: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affe814-3a2d-a75d-4836-0000000001b6] 22286 1726882783.42579: sending task result for task 0affe814-3a2d-a75d-4836-0000000001b6
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003519",
    "end": "2024-09-20 21:39:43.377151",
    "rc": 0,
    "start": "2024-09-20 21:39:43.373632"
}

STDOUT:

bonding_masters
eth0
lo

22286 1726882783.42851: no more pending results, returning what we have 22286 1726882783.42855: results queue empty 22286 1726882783.42856: checking for any_errors_fatal 22286 1726882783.42858: done checking for any_errors_fatal 22286 1726882783.42859: checking for max_fail_percentage 22286 1726882783.42861: done checking for max_fail_percentage 22286 1726882783.42862: checking to see if all hosts have failed and the running result is not ok 22286 1726882783.42870: done checking to see if all hosts have failed 22286 
1726882783.42871: getting the remaining hosts for this loop 22286 1726882783.42873: done getting the remaining hosts for this loop 22286 1726882783.42881: getting the next task for host managed_node3 22286 1726882783.42891: done getting next task for host managed_node3 22286 1726882783.42893: ^ task is: TASK: Set current_interfaces 22286 1726882783.42900: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882783.42906: getting variables 22286 1726882783.42907: in VariableManager get_vars() 22286 1726882783.43060: Calling all_inventory to load vars for managed_node3 22286 1726882783.43065: Calling groups_inventory to load vars for managed_node3 22286 1726882783.43068: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882783.43083: Calling all_plugins_play to load vars for managed_node3 22286 1726882783.43144: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882783.43149: Calling groups_plugins_play to load vars for managed_node3 22286 1726882783.43403: done sending task result for task 0affe814-3a2d-a75d-4836-0000000001b6 22286 1726882783.43407: WORKER PROCESS EXITING 22286 1726882783.43424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882783.43642: done with get_vars() 22286 1726882783.43652: done getting variables 22286 1726882783.43703: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:43 -0400 (0:00:00.418) 0:00:06.830 ****** 22286 1726882783.43727: entering _queue_task() for managed_node3/set_fact 22286 1726882783.43941: worker is 1 (out of 1 available) 22286 1726882783.43954: exiting _queue_task() for managed_node3/set_fact 22286 1726882783.43967: done queuing things up, now waiting for results queue to drain 22286 1726882783.43968: waiting for pending results... 
22286 1726882783.44131: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 22286 1726882783.44219: in run() - task 0affe814-3a2d-a75d-4836-0000000001b7 22286 1726882783.44231: variable 'ansible_search_path' from source: unknown 22286 1726882783.44236: variable 'ansible_search_path' from source: unknown 22286 1726882783.44269: calling self._execute() 22286 1726882783.44342: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.44349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.44359: variable 'omit' from source: magic vars 22286 1726882783.44682: variable 'ansible_distribution_major_version' from source: facts 22286 1726882783.44692: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882783.44700: variable 'omit' from source: magic vars 22286 1726882783.44743: variable 'omit' from source: magic vars 22286 1726882783.44830: variable '_current_interfaces' from source: set_fact 22286 1726882783.44886: variable 'omit' from source: magic vars 22286 1726882783.44919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882783.44952: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882783.44973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882783.44990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.45001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.45028: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882783.45031: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.45037: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.45123: Set connection var ansible_shell_executable to /bin/sh 22286 1726882783.45132: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882783.45137: Set connection var ansible_connection to ssh 22286 1726882783.45139: Set connection var ansible_shell_type to sh 22286 1726882783.45146: Set connection var ansible_timeout to 10 22286 1726882783.45154: Set connection var ansible_pipelining to False 22286 1726882783.45178: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.45182: variable 'ansible_connection' from source: unknown 22286 1726882783.45187: variable 'ansible_module_compression' from source: unknown 22286 1726882783.45189: variable 'ansible_shell_type' from source: unknown 22286 1726882783.45192: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.45194: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.45196: variable 'ansible_pipelining' from source: unknown 22286 1726882783.45198: variable 'ansible_timeout' from source: unknown 22286 1726882783.45439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.45444: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882783.45447: variable 'omit' from source: magic vars 22286 1726882783.45449: starting attempt loop 22286 1726882783.45451: running the handler 22286 1726882783.45453: handler run complete 22286 1726882783.45456: attempt loop complete, returning result 22286 1726882783.45467: _execute() done 22286 1726882783.45478: dumping result to json 22286 1726882783.45487: done dumping result, returning 22286 
1726882783.45499: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affe814-3a2d-a75d-4836-0000000001b7] 22286 1726882783.45510: sending task result for task 0affe814-3a2d-a75d-4836-0000000001b7
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}

22286 1726882783.45715: no more pending results, returning what we have 22286 1726882783.45719: results queue empty 22286 1726882783.45720: checking for any_errors_fatal 22286 1726882783.45733: done checking for any_errors_fatal 22286 1726882783.45736: checking for max_fail_percentage 22286 1726882783.45738: done checking for max_fail_percentage 22286 1726882783.45740: checking to see if all hosts have failed and the running result is not ok 22286 1726882783.45741: done checking to see if all hosts have failed 22286 1726882783.45742: getting the remaining hosts for this loop 22286 1726882783.45744: done getting the remaining hosts for this loop 22286 1726882783.45749: getting the next task for host managed_node3 22286 1726882783.45758: done getting next task for host managed_node3 22286 1726882783.45761: ^ task is: TASK: Show current_interfaces 22286 1726882783.45767: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882783.45772: getting variables 22286 1726882783.45773: in VariableManager get_vars() 22286 1726882783.45978: Calling all_inventory to load vars for managed_node3 22286 1726882783.45981: Calling groups_inventory to load vars for managed_node3 22286 1726882783.45985: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882783.45996: Calling all_plugins_play to load vars for managed_node3 22286 1726882783.46000: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882783.46004: Calling groups_plugins_play to load vars for managed_node3 22286 1726882783.46333: done sending task result for task 0affe814-3a2d-a75d-4836-0000000001b7 22286 1726882783.46338: WORKER PROCESS EXITING 22286 1726882783.46364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882783.46694: done with get_vars() 22286 1726882783.46705: done getting variables 22286 1726882783.46773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:43 -0400 (0:00:00.030) 0:00:06.861 ****** 22286 1726882783.46809: entering _queue_task() for managed_node3/debug 22286 1726882783.47167: worker is 1 (out of 1 available) 22286 1726882783.47181: exiting _queue_task() for managed_node3/debug 22286 1726882783.47193: done queuing things up, now waiting for results queue to drain 22286 1726882783.47195: waiting for pending 
results... 22286 1726882783.47381: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 22286 1726882783.47609: in run() - task 0affe814-3a2d-a75d-4836-000000000180 22286 1726882783.47613: variable 'ansible_search_path' from source: unknown 22286 1726882783.47615: variable 'ansible_search_path' from source: unknown 22286 1726882783.47618: calling self._execute() 22286 1726882783.47716: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.47729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.47751: variable 'omit' from source: magic vars 22286 1726882783.48246: variable 'ansible_distribution_major_version' from source: facts 22286 1726882783.48272: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882783.48293: variable 'omit' from source: magic vars 22286 1726882783.48482: variable 'omit' from source: magic vars 22286 1726882783.48510: variable 'current_interfaces' from source: set_fact 22286 1726882783.48548: variable 'omit' from source: magic vars 22286 1726882783.48611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882783.48662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882783.48700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882783.48735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.48755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.48798: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882783.48815: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.48828: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.48969: Set connection var ansible_shell_executable to /bin/sh 22286 1726882783.48988: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882783.48996: Set connection var ansible_connection to ssh 22286 1726882783.49004: Set connection var ansible_shell_type to sh 22286 1726882783.49029: Set connection var ansible_timeout to 10 22286 1726882783.49032: Set connection var ansible_pipelining to False 22286 1726882783.49139: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.49145: variable 'ansible_connection' from source: unknown 22286 1726882783.49147: variable 'ansible_module_compression' from source: unknown 22286 1726882783.49149: variable 'ansible_shell_type' from source: unknown 22286 1726882783.49151: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.49153: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.49155: variable 'ansible_pipelining' from source: unknown 22286 1726882783.49156: variable 'ansible_timeout' from source: unknown 22286 1726882783.49158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.49288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882783.49340: variable 'omit' from source: magic vars 22286 1726882783.49343: starting attempt loop 22286 1726882783.49350: running the handler 22286 1726882783.49392: handler run complete 22286 1726882783.49414: attempt loop complete, returning result 22286 1726882783.49422: _execute() done 22286 1726882783.49430: dumping result to json 22286 1726882783.49460: done dumping result, returning 22286 1726882783.49464: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affe814-3a2d-a75d-4836-000000000180] 22286 1726882783.49471: sending task result for task 0affe814-3a2d-a75d-4836-000000000180 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22286 1726882783.49731: no more pending results, returning what we have 22286 1726882783.49739: results queue empty 22286 1726882783.49741: checking for any_errors_fatal 22286 1726882783.49746: done checking for any_errors_fatal 22286 1726882783.49747: checking for max_fail_percentage 22286 1726882783.49749: done checking for max_fail_percentage 22286 1726882783.49751: checking to see if all hosts have failed and the running result is not ok 22286 1726882783.49752: done checking to see if all hosts have failed 22286 1726882783.49753: getting the remaining hosts for this loop 22286 1726882783.49754: done getting the remaining hosts for this loop 22286 1726882783.49759: getting the next task for host managed_node3 22286 1726882783.49768: done getting next task for host managed_node3 22286 1726882783.49772: ^ task is: TASK: Install iproute 22286 1726882783.49778: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882783.49785: getting variables 22286 1726882783.49786: in VariableManager get_vars() 22286 1726882783.49830: Calling all_inventory to load vars for managed_node3 22286 1726882783.49940: Calling groups_inventory to load vars for managed_node3 22286 1726882783.49945: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882783.49957: Calling all_plugins_play to load vars for managed_node3 22286 1726882783.49960: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882783.49964: Calling groups_plugins_play to load vars for managed_node3 22286 1726882783.50278: done sending task result for task 0affe814-3a2d-a75d-4836-000000000180 22286 1726882783.50282: WORKER PROCESS EXITING 22286 1726882783.50314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882783.50639: done with get_vars() 22286 1726882783.50652: done getting variables 22286 1726882783.50716: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:39:43 -0400 (0:00:00.039) 0:00:06.900 ****** 22286 1726882783.50757: entering _queue_task() for managed_node3/package 22286 1726882783.51024: worker is 1 (out of 1 available) 22286 1726882783.51140: exiting _queue_task() for managed_node3/package 22286 1726882783.51152: done queuing things up, now waiting for results queue to drain 22286 1726882783.51154: waiting for pending results... 
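For orientation while reading the trace above: the completed TASK [Show current_interfaces] (task path `.../tasks/show_interfaces.yml:5`) is a plain `debug` action whose handler prints the `current_interfaces` fact set earlier, producing the `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']` result seen in the log. A minimal sketch of what such a task likely looks like — this is a hypothetical reconstruction; only the task name and the printed variable come from the log, the actual file may differ:

```yaml
# Hypothetical reconstruction of the task logged at
# tests/network/playbooks/tasks/show_interfaces.yml:5.
# The log shows the debug action module being loaded and the message
# "current_interfaces: [...]" being emitted for managed_node3.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

In the trace, this single task accounts for the full TaskExecutor lifecycle shown: evaluating the host conditional (`ansible_distribution_major_version != '6'`), resolving connection vars (ssh, `/bin/sh`, pipelining off, timeout 10), running the debug handler locally (no remote command is executed for `debug`), and sending the task result back to the results queue.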
22286 1726882783.51332: running TaskExecutor() for managed_node3/TASK: Install iproute 22286 1726882783.51456: in run() - task 0affe814-3a2d-a75d-4836-000000000159 22286 1726882783.51481: variable 'ansible_search_path' from source: unknown 22286 1726882783.51495: variable 'ansible_search_path' from source: unknown 22286 1726882783.51541: calling self._execute() 22286 1726882783.51641: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.51654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.51672: variable 'omit' from source: magic vars 22286 1726882783.52107: variable 'ansible_distribution_major_version' from source: facts 22286 1726882783.52126: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882783.52144: variable 'omit' from source: magic vars 22286 1726882783.52205: variable 'omit' from source: magic vars 22286 1726882783.52583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882783.56941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882783.57130: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882783.57285: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882783.57337: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882783.57539: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882783.57695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882783.57737: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882783.57813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882783.58040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882783.58043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882783.58272: variable '__network_is_ostree' from source: set_fact 22286 1726882783.58288: variable 'omit' from source: magic vars 22286 1726882783.58538: variable 'omit' from source: magic vars 22286 1726882783.58542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882783.58547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882783.58572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882783.58675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.58696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882783.58738: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882783.58767: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.58869: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 22286 1726882783.59022: Set connection var ansible_shell_executable to /bin/sh 22286 1726882783.59099: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882783.59107: Set connection var ansible_connection to ssh 22286 1726882783.59113: Set connection var ansible_shell_type to sh 22286 1726882783.59123: Set connection var ansible_timeout to 10 22286 1726882783.59241: Set connection var ansible_pipelining to False 22286 1726882783.59248: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.59257: variable 'ansible_connection' from source: unknown 22286 1726882783.59265: variable 'ansible_module_compression' from source: unknown 22286 1726882783.59273: variable 'ansible_shell_type' from source: unknown 22286 1726882783.59283: variable 'ansible_shell_executable' from source: unknown 22286 1726882783.59310: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882783.59319: variable 'ansible_pipelining' from source: unknown 22286 1726882783.59642: variable 'ansible_timeout' from source: unknown 22286 1726882783.59645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882783.59689: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882783.59708: variable 'omit' from source: magic vars 22286 1726882783.59720: starting attempt loop 22286 1726882783.59727: running the handler 22286 1726882783.59744: variable 'ansible_facts' from source: unknown 22286 1726882783.59840: variable 'ansible_facts' from source: unknown 22286 1726882783.59844: _low_level_execute_command(): starting 22286 1726882783.59846: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 
1726882783.61340: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882783.61360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882783.61450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882783.61506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.61649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882783.61725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.61882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.63737: stdout chunk (state=3): >>>/root <<< 22286 1726882783.63896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.64050: stderr chunk (state=3): >>><<< 22286 1726882783.64054: stdout chunk (state=3): >>><<< 22286 1726882783.64075: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882783.64311: _low_level_execute_command(): starting 22286 1726882783.64321: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928 `" && echo ansible-tmp-1726882783.6409261-22572-220698788008928="` echo /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928 `" ) && sleep 0' 22286 1726882783.65212: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882783.65228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882783.65248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882783.65321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882783.65326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882783.65413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882783.65544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882783.65656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882783.67935: stdout chunk (state=3): >>>ansible-tmp-1726882783.6409261-22572-220698788008928=/root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928 <<< 22286 1726882783.67952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882783.68420: stderr chunk (state=3): >>><<< 22286 1726882783.68424: stdout chunk (state=3): >>><<< 22286 1726882783.68426: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882783.6409261-22572-220698788008928=/root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882783.68429: variable 'ansible_module_compression' from source: unknown 22286 1726882783.68432: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 22286 1726882783.68438: ANSIBALLZ: Acquiring lock 22286 1726882783.68441: ANSIBALLZ: Lock acquired: 140212085117232 22286 1726882783.68443: ANSIBALLZ: Creating module 22286 1726882784.01300: ANSIBALLZ: Writing module into payload 22286 1726882784.01869: ANSIBALLZ: Writing module 22286 1726882784.01939: ANSIBALLZ: Renaming module 22286 1726882784.02027: ANSIBALLZ: Done creating module 22286 1726882784.02055: variable 'ansible_facts' from source: unknown 22286 1726882784.02290: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py 22286 1726882784.02658: Sending initial data 22286 1726882784.02858: Sent initial data (152 bytes) 22286 1726882784.04115: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882784.04118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882784.04120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882784.04261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882784.06081: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22286 1726882784.06181: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882784.06294: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882784.06411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpt2jb25mb /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py <<< 22286 1726882784.06422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py" <<< 22286 1726882784.06565: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpt2jb25mb" to remote "/root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py" <<< 22286 1726882784.10153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882784.10157: stdout chunk (state=3): >>><<< 22286 1726882784.10160: stderr chunk (state=3): >>><<< 22286 1726882784.10162: done transferring module to remote 22286 1726882784.10440: _low_level_execute_command(): starting 22286 1726882784.10445: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/ /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py && sleep 0' 22286 1726882784.11855: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882784.12054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882784.14094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882784.14098: stdout chunk (state=3): >>><<< 22286 1726882784.14108: stderr chunk (state=3): >>><<< 22286 1726882784.14129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 22286 1726882784.14133: _low_level_execute_command(): starting 22286 1726882784.14144: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/AnsiballZ_dnf.py && sleep 0' 22286 1726882784.15359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882784.15396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882784.15416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882784.15452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882784.15630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882784.15636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882784.15771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882784.15888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882785.61888: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": 
["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22286 1726882785.66777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882785.66824: stderr chunk (state=3): >>><<< 22286 1726882785.66828: stdout chunk (state=3): >>><<< 22286 1726882785.66851: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882785.66902: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882785.66911: _low_level_execute_command(): starting 22286 1726882785.66914: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882783.6409261-22572-220698788008928/ > /dev/null 2>&1 && 
sleep 0' 22286 1726882785.67362: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.67366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.67368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.67371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.67427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882785.67430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882785.67545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882785.69564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882785.69612: stderr chunk (state=3): >>><<< 22286 1726882785.69616: stdout chunk (state=3): >>><<< 22286 1726882785.69635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882785.69644: handler run complete 22286 1726882785.69793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882785.69938: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882785.69978: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882785.70006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882785.70032: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882785.70097: variable '__install_status' from source: unknown 22286 1726882785.70116: Evaluated conditional (__install_status is success): True 22286 1726882785.70132: attempt loop complete, returning result 22286 1726882785.70137: _execute() done 22286 1726882785.70142: dumping result to json 22286 1726882785.70149: done dumping result, returning 22286 1726882785.70156: done running 
TaskExecutor() for managed_node3/TASK: Install iproute [0affe814-3a2d-a75d-4836-000000000159] 22286 1726882785.70164: sending task result for task 0affe814-3a2d-a75d-4836-000000000159 22286 1726882785.70270: done sending task result for task 0affe814-3a2d-a75d-4836-000000000159 22286 1726882785.70274: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22286 1726882785.70390: no more pending results, returning what we have 22286 1726882785.70394: results queue empty 22286 1726882785.70395: checking for any_errors_fatal 22286 1726882785.70400: done checking for any_errors_fatal 22286 1726882785.70401: checking for max_fail_percentage 22286 1726882785.70403: done checking for max_fail_percentage 22286 1726882785.70404: checking to see if all hosts have failed and the running result is not ok 22286 1726882785.70405: done checking to see if all hosts have failed 22286 1726882785.70406: getting the remaining hosts for this loop 22286 1726882785.70408: done getting the remaining hosts for this loop 22286 1726882785.70413: getting the next task for host managed_node3 22286 1726882785.70419: done getting next task for host managed_node3 22286 1726882785.70422: ^ task is: TASK: Create veth interface {{ interface }} 22286 1726882785.70425: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882785.70429: getting variables 22286 1726882785.70430: in VariableManager get_vars() 22286 1726882785.70482: Calling all_inventory to load vars for managed_node3 22286 1726882785.70486: Calling groups_inventory to load vars for managed_node3 22286 1726882785.70488: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882785.70499: Calling all_plugins_play to load vars for managed_node3 22286 1726882785.70502: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882785.70506: Calling groups_plugins_play to load vars for managed_node3 22286 1726882785.70696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882785.70885: done with get_vars() 22286 1726882785.70895: done getting variables 22286 1726882785.70945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882785.71047: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:39:45 -0400 (0:00:02.203) 0:00:09.104 ****** 22286 1726882785.71086: entering _queue_task() for managed_node3/command 22286 1726882785.71301: worker is 1 (out of 1 available) 22286 1726882785.71312: exiting _queue_task() for managed_node3/command 22286 1726882785.71326: done queuing things up, now waiting for results queue to drain 22286 1726882785.71327: waiting for pending results... 
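The task banner above shows `{{ interface }}` from play vars resolving to `veth0` before the looped command runs. A plain-shell sketch of that substitution (the variable name comes from the log; the shell equivalent is illustrative, not how Ansible templates internally):

```shell
# The loop item is the command template with {{ interface }} substituted.
# Plain-shell equivalent of the templating result seen in the log:
interface=veth0
cmd="ip link add ${interface} type veth peer name peer${interface}"
echo "$cmd"
# -> ip link add veth0 type veth peer name peerveth0
```

This matches the `cmd` list recorded later in the module result for item `ip link add veth0 type veth peer name peerveth0`.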
22286 1726882785.71489: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 22286 1726882785.71566: in run() - task 0affe814-3a2d-a75d-4836-00000000015a 22286 1726882785.71579: variable 'ansible_search_path' from source: unknown 22286 1726882785.71583: variable 'ansible_search_path' from source: unknown 22286 1726882785.71797: variable 'interface' from source: play vars 22286 1726882785.71864: variable 'interface' from source: play vars 22286 1726882785.71929: variable 'interface' from source: play vars 22286 1726882785.72119: Loaded config def from plugin (lookup/items) 22286 1726882785.72126: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 22286 1726882785.72145: variable 'omit' from source: magic vars 22286 1726882785.72229: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882785.72240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882785.72251: variable 'omit' from source: magic vars 22286 1726882785.72437: variable 'ansible_distribution_major_version' from source: facts 22286 1726882785.72442: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882785.72602: variable 'type' from source: play vars 22286 1726882785.72606: variable 'state' from source: include params 22286 1726882785.72611: variable 'interface' from source: play vars 22286 1726882785.72616: variable 'current_interfaces' from source: set_fact 22286 1726882785.72623: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22286 1726882785.72630: variable 'omit' from source: magic vars 22286 1726882785.72668: variable 'omit' from source: magic vars 22286 1726882785.72704: variable 'item' from source: unknown 22286 1726882785.72765: variable 'item' from source: unknown 22286 1726882785.72781: variable 'omit' from source: magic vars 22286 1726882785.72806: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882785.72831: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882785.72848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882785.72867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882785.72880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882785.72905: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882785.72909: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882785.72913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882785.73001: Set connection var ansible_shell_executable to /bin/sh 22286 1726882785.73010: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882785.73013: Set connection var ansible_connection to ssh 22286 1726882785.73015: Set connection var ansible_shell_type to sh 22286 1726882785.73022: Set connection var ansible_timeout to 10 22286 1726882785.73030: Set connection var ansible_pipelining to False 22286 1726882785.73048: variable 'ansible_shell_executable' from source: unknown 22286 1726882785.73051: variable 'ansible_connection' from source: unknown 22286 1726882785.73054: variable 'ansible_module_compression' from source: unknown 22286 1726882785.73059: variable 'ansible_shell_type' from source: unknown 22286 1726882785.73062: variable 'ansible_shell_executable' from source: unknown 22286 1726882785.73066: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882785.73072: variable 'ansible_pipelining' from source: unknown 22286 1726882785.73075: variable 'ansible_timeout' from 
source: unknown 22286 1726882785.73080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882785.73189: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882785.73198: variable 'omit' from source: magic vars 22286 1726882785.73202: starting attempt loop 22286 1726882785.73205: running the handler 22286 1726882785.73222: _low_level_execute_command(): starting 22286 1726882785.73230: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882785.73740: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.73744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.73770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.73818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882785.73822: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882785.73941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882785.75657: stdout chunk (state=3): >>>/root <<< 22286 1726882785.75766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882785.75817: stderr chunk (state=3): >>><<< 22286 1726882785.75820: stdout chunk (state=3): >>><<< 22286 1726882785.75842: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882785.75854: _low_level_execute_command(): starting 22286 1726882785.75861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554 `" && echo 
ansible-tmp-1726882785.7584264-22637-10364369067554="` echo /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554 `" ) && sleep 0' 22286 1726882785.76307: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.76311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.76314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.76316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.76370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882785.76373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882785.76492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882785.78530: stdout chunk (state=3): >>>ansible-tmp-1726882785.7584264-22637-10364369067554=/root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554 <<< 22286 1726882785.78657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882785.78703: stderr chunk (state=3): >>><<< 22286 1726882785.78707: stdout chunk 
(state=3): >>><<< 22286 1726882785.78723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882785.7584264-22637-10364369067554=/root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882785.78749: variable 'ansible_module_compression' from source: unknown 22286 1726882785.78786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882785.78814: variable 'ansible_facts' from source: unknown 22286 1726882785.78883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py 22286 1726882785.78991: Sending initial data 22286 1726882785.78995: Sent initial data (155 bytes) 22286 1726882785.79417: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.79454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882785.79457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882785.79460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882785.79464: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882785.79466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.79517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882785.79523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882785.79640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882785.81321: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882785.81325: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882785.81433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882785.81547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp9fyp5m_d /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py <<< 22286 1726882785.81555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py" <<< 22286 1726882785.81661: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp9fyp5m_d" to remote "/root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py" <<< 22286 1726882785.82733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882785.82788: stderr chunk (state=3): >>><<< 22286 1726882785.82791: stdout chunk (state=3): >>><<< 22286 1726882785.82809: done transferring module to remote 22286 1726882785.82822: _low_level_execute_command(): starting 22286 1726882785.82826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/ /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py && sleep 0' 22286 1726882785.83272: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.83279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.83286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882785.83288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882785.83291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.83329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882785.83332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882785.83451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882785.85345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882785.85386: stderr chunk (state=3): >>><<< 22286 1726882785.85389: stdout chunk (state=3): >>><<< 22286 1726882785.85404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882785.85410: _low_level_execute_command(): starting 22286 1726882785.85413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/AnsiballZ_command.py && sleep 0' 22286 1726882785.85857: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882785.85860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.85863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882785.85866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 22286 1726882785.85869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882785.85920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882785.85924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882785.86048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.04179: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:39:46.030578", "end": "2024-09-20 21:39:46.036981", "delta": "0:00:00.006403", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882786.06896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882786.06960: stderr chunk (state=3): >>><<< 22286 1726882786.06963: stdout chunk (state=3): >>><<< 22286 1726882786.06982: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:39:46.030578", "end": "2024-09-20 21:39:46.036981", "delta": "0:00:00.006403", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
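The `_low_level_execute_command()` calls above trace Ansible's standard per-task remote sequence: create a private tmpdir, upload the AnsiballZ payload over sftp, `chmod` it, execute it with the remote Python, then remove the tmpdir. A minimal sketch of that lifecycle, with an illustrative tmpdir name (the real one embeds a timestamp and counter, e.g. `ansible-tmp-1726882785.7584264-22637-...`) and the actual module execution commented out since it would run `ip link add` as root:

```shell
# Sketch of the remote command sequence visible in the log above.
base="${TMPDIR:-/tmp}/.ansible/tmp"
tmpdir="$base/ansible-tmp-example"              # illustrative name only
( umask 77 && mkdir -p "$base" && mkdir "$tmpdir" )   # private tmpdir, mode 0700
: > "$tmpdir/AnsiballZ_command.py"              # payload is uploaded here via sftp
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_command.py"    # the chmod step in the log
# python3 "$tmpdir/AnsiballZ_command.py"        # would run the wrapped command module
rm -rf "$tmpdir"                                # cleanup, as in the final 'rm -f -r'
```

Every `&& sleep 0` in the logged commands is Ansible's way of flushing output through the SSH multiplexed channel before the shell exits; it is not part of the module's work.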
22286 1726882786.07023: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882786.07031: _low_level_execute_command(): starting 22286 1726882786.07039: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882785.7584264-22637-10364369067554/ > /dev/null 2>&1 && sleep 0' 22286 1726882786.07504: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.07513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.07543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.07546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.07549: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.07610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.07613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.07778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.13495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.13545: stderr chunk (state=3): >>><<< 22286 1726882786.13549: stdout chunk (state=3): >>><<< 22286 1726882786.13567: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 22286 1726882786.13573: handler run complete 22286 1726882786.13596: Evaluated conditional (False): False 22286 1726882786.13606: attempt loop complete, returning result 22286 1726882786.13623: variable 'item' from source: unknown 22286 1726882786.13698: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.006403", "end": "2024-09-20 21:39:46.036981", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:39:46.030578" } 22286 1726882786.13890: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.13893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.13896: variable 'omit' from source: magic vars 22286 1726882786.13996: variable 'ansible_distribution_major_version' from source: facts 22286 1726882786.14002: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882786.14161: variable 'type' from source: play vars 22286 1726882786.14164: variable 'state' from source: include params 22286 1726882786.14170: variable 'interface' from source: play vars 22286 1726882786.14175: variable 'current_interfaces' from source: set_fact 22286 1726882786.14183: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22286 1726882786.14188: variable 'omit' from source: magic vars 22286 1726882786.14201: variable 'omit' from source: magic vars 22286 1726882786.14239: variable 'item' from source: unknown 22286 1726882786.14291: variable 'item' from source: unknown 22286 1726882786.14304: variable 'omit' from source: magic vars 22286 1726882786.14323: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882786.14331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882786.14340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882786.14356: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882786.14359: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.14362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.14427: Set connection var ansible_shell_executable to /bin/sh 22286 1726882786.14436: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882786.14439: Set connection var ansible_connection to ssh 22286 1726882786.14442: Set connection var ansible_shell_type to sh 22286 1726882786.14450: Set connection var ansible_timeout to 10 22286 1726882786.14463: Set connection var ansible_pipelining to False 22286 1726882786.14479: variable 'ansible_shell_executable' from source: unknown 22286 1726882786.14482: variable 'ansible_connection' from source: unknown 22286 1726882786.14486: variable 'ansible_module_compression' from source: unknown 22286 1726882786.14488: variable 'ansible_shell_type' from source: unknown 22286 1726882786.14490: variable 'ansible_shell_executable' from source: unknown 22286 1726882786.14495: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.14500: variable 'ansible_pipelining' from source: unknown 22286 1726882786.14503: variable 'ansible_timeout' from source: unknown 22286 1726882786.14508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.14591: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882786.14600: variable 'omit' from source: magic vars 22286 1726882786.14606: starting attempt loop 22286 1726882786.14608: running the handler 22286 1726882786.14616: _low_level_execute_command(): starting 22286 1726882786.14621: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882786.15089: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.15093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882786.15095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882786.15097: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.15105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.15157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.15160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 <<< 22286 1726882786.15281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.17042: stdout chunk (state=3): >>>/root <<< 22286 1726882786.17150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.17199: stderr chunk (state=3): >>><<< 22286 1726882786.17202: stdout chunk (state=3): >>><<< 22286 1726882786.17215: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.17224: _low_level_execute_command(): starting 22286 1726882786.17229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567 `" && echo ansible-tmp-1726882786.1721501-22637-27206186791567="` echo 
/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567 `" ) && sleep 0' 22286 1726882786.17671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.17675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882786.17680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882786.17682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882786.17685: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.17739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.17742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.17856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.19968: stdout chunk (state=3): >>>ansible-tmp-1726882786.1721501-22637-27206186791567=/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567 <<< 22286 1726882786.20086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.20128: stderr chunk (state=3): >>><<< 22286 1726882786.20131: stdout chunk 
(state=3): >>><<< 22286 1726882786.20146: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882786.1721501-22637-27206186791567=/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.20164: variable 'ansible_module_compression' from source: unknown 22286 1726882786.20198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882786.20215: variable 'ansible_facts' from source: unknown 22286 1726882786.20262: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py 22286 1726882786.20357: Sending initial data 22286 1726882786.20361: Sent initial data (155 bytes) 22286 1726882786.20808: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.20811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.20814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.20816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.20876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.20881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.20996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.22883: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882786.23006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882786.23127: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp_l_19hlk /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py <<< 22286 1726882786.23131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py" <<< 22286 1726882786.23232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp_l_19hlk" to remote "/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py" <<< 22286 1726882786.24840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.24843: stdout chunk (state=3): >>><<< 22286 1726882786.24846: stderr chunk (state=3): >>><<< 22286 1726882786.24848: done transferring module to remote 22286 1726882786.24851: _low_level_execute_command(): starting 22286 1726882786.24853: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/ /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py && sleep 0' 22286 1726882786.25387: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.25401: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.25414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.25424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.25470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.25491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.25601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.27608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.27655: stderr chunk (state=3): >>><<< 22286 1726882786.27658: stdout chunk (state=3): >>><<< 22286 1726882786.27668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.27671: _low_level_execute_command(): starting 22286 1726882786.27677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/AnsiballZ_command.py && sleep 0' 22286 1726882786.28097: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.28100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.28103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.28105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.28154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.28160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.28280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.45802: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:39:46.452271", "end": "2024-09-20 21:39:46.456115", "delta": "0:00:00.003844", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882786.47544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882786.47548: stdout chunk (state=3): >>><<< 22286 1726882786.47550: stderr chunk (state=3): >>><<< 22286 1726882786.47553: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:39:46.452271", "end": "2024-09-20 21:39:46.456115", "delta": "0:00:00.003844", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882786.47570: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882786.47587: _low_level_execute_command(): starting 22286 1726882786.47597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882786.1721501-22637-27206186791567/ > /dev/null 2>&1 && sleep 0' 22286 1726882786.48291: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882786.48429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.48432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.48478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.48585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.50650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.50653: stdout chunk (state=3): >>><<< 22286 1726882786.50656: stderr chunk (state=3): >>><<< 22286 1726882786.50674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.50703: handler run complete 22286 1726882786.50759: Evaluated conditional (False): False 22286 1726882786.50844: attempt 
loop complete, returning result 22286 1726882786.50848: variable 'item' from source: unknown 22286 1726882786.50894: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003844", "end": "2024-09-20 21:39:46.456115", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:39:46.452271" } 22286 1726882786.51242: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.51246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.51249: variable 'omit' from source: magic vars 22286 1726882786.51251: variable 'ansible_distribution_major_version' from source: facts 22286 1726882786.51253: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882786.51571: variable 'type' from source: play vars 22286 1726882786.51574: variable 'state' from source: include params 22286 1726882786.51580: variable 'interface' from source: play vars 22286 1726882786.51583: variable 'current_interfaces' from source: set_fact 22286 1726882786.51588: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22286 1726882786.51593: variable 'omit' from source: magic vars 22286 1726882786.51596: variable 'omit' from source: magic vars 22286 1726882786.51598: variable 'item' from source: unknown 22286 1726882786.51626: variable 'item' from source: unknown 22286 1726882786.51647: variable 'omit' from source: magic vars 22286 1726882786.51679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882786.51683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882786.51691: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882786.51709: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882786.51713: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.51717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.51816: Set connection var ansible_shell_executable to /bin/sh 22286 1726882786.51899: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882786.51903: Set connection var ansible_connection to ssh 22286 1726882786.51905: Set connection var ansible_shell_type to sh 22286 1726882786.51908: Set connection var ansible_timeout to 10 22286 1726882786.51910: Set connection var ansible_pipelining to False 22286 1726882786.51912: variable 'ansible_shell_executable' from source: unknown 22286 1726882786.51919: variable 'ansible_connection' from source: unknown 22286 1726882786.51921: variable 'ansible_module_compression' from source: unknown 22286 1726882786.51923: variable 'ansible_shell_type' from source: unknown 22286 1726882786.51926: variable 'ansible_shell_executable' from source: unknown 22286 1726882786.51928: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.51930: variable 'ansible_pipelining' from source: unknown 22286 1726882786.51932: variable 'ansible_timeout' from source: unknown 22286 1726882786.51936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.52021: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882786.52030: variable 'omit' from source: magic vars 22286 1726882786.52037: 
starting attempt loop 22286 1726882786.52040: running the handler 22286 1726882786.52046: _low_level_execute_command(): starting 22286 1726882786.52053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882786.52661: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882786.52665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.52668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.52670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.52673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882786.52678: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882786.52686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.52689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882786.52699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882786.52706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882786.52715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.52725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.52883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.52887: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 22286 1726882786.53088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.54870: stdout chunk (state=3): >>>/root <<< 22286 1726882786.54952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.55006: stderr chunk (state=3): >>><<< 22286 1726882786.55042: stdout chunk (state=3): >>><<< 22286 1726882786.55156: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.55169: _low_level_execute_command(): starting 22286 1726882786.55172: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057 `" && echo ansible-tmp-1726882786.5515525-22637-85955534866057="` 
echo /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057 `" ) && sleep 0' 22286 1726882786.56439: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882786.56443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.56445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.56447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.56450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882786.56452: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882786.56454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.56630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.56649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.56789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.58830: stdout chunk (state=3): >>>ansible-tmp-1726882786.5515525-22637-85955534866057=/root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057 <<< 22286 1726882786.58955: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 22286 1726882786.59147: stderr chunk (state=3): >>><<< 22286 1726882786.59157: stdout chunk (state=3): >>><<< 22286 1726882786.59352: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882786.5515525-22637-85955534866057=/root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.59355: variable 'ansible_module_compression' from source: unknown 22286 1726882786.59358: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882786.59360: variable 'ansible_facts' from source: unknown 22286 1726882786.59529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py 22286 1726882786.59994: Sending initial data 22286 1726882786.59997: Sent 
initial data (155 bytes) 22286 1726882786.61160: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.61185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.61350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.61403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.61491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.63206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882786.63314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882786.63425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpsggnjsfc /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py <<< 22286 1726882786.63428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py" <<< 22286 1726882786.63562: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpsggnjsfc" to remote "/root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py" <<< 22286 1726882786.65842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.65979: stderr chunk (state=3): >>><<< 22286 1726882786.65982: stdout chunk (state=3): >>><<< 22286 1726882786.66004: done transferring module to remote 22286 1726882786.66185: _low_level_execute_command(): starting 22286 1726882786.66189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/ /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py && sleep 0' 22286 1726882786.67396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.67402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.67408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.67413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882786.67424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.67623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.67851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.67997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.70001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.70008: stdout chunk (state=3): >>><<< 22286 1726882786.70011: stderr chunk (state=3): >>><<< 22286 1726882786.70030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.70033: _low_level_execute_command(): starting 22286 1726882786.70045: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/AnsiballZ_command.py && sleep 0' 22286 1726882786.70657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882786.70701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.70704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.70707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.70710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882786.70712: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882786.70717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.70733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882786.70743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 
1726882786.70809: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882786.70813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.70815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.70853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.70863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.70875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.70895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.71037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.88701: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:39:46.880532", "end": "2024-09-20 21:39:46.884788", "delta": "0:00:00.004256", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882786.90641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882786.90646: stdout chunk (state=3): >>><<< 22286 1726882786.90649: stderr chunk (state=3): >>><<< 22286 1726882786.90651: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:39:46.880532", "end": "2024-09-20 21:39:46.884788", "delta": "0:00:00.004256", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
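The exchange above, together with the `chmod` and `rm -f -r` steps around it, is Ansible's standard per-task remote-execution lifecycle: probe the remote home directory (`echo ~`), create a private temp directory under `umask 77`, transfer the AnsiballZ payload over SFTP, mark it executable, run it with the remote Python, and remove the directory. A minimal local sketch of that same shell sequence follows — the paths, the payload, and the use of `python3` are illustrative assumptions, not the values from this run:

```shell
#!/bin/sh
# Sketch of the command sequence ansible-playbook issues over SSH for one task.
# TMPROOT and the payload are illustrative; the real run used
# /root/.ansible/tmp/ansible-tmp-<epoch>-<pid>-<rand> and AnsiballZ_command.py.
set -e

# 1. Probe the remote home directory ('echo ~ && sleep 0' in the log).
HOME_DIR=$(sh -c 'echo ~')
echo "remote home: $HOME_DIR"

# 2. Create a private task directory; umask 77 yields mode 700.
TMPROOT=${TMPDIR:-/tmp}/ansible-sketch
TASKDIR="$TMPROOT/ansible-tmp-$$"
rm -f -r "$TASKDIR"
( umask 77 && mkdir -p "$TMPROOT" && mkdir "$TASKDIR" )

# 3. "Transfer" the module payload (the real run does an sftp put).
printf 'print("ok")\n' > "$TASKDIR/AnsiballZ_command.py"

# 4. Make directory and payload executable, then run with the remote Python.
chmod u+x "$TASKDIR" "$TASKDIR/AnsiballZ_command.py"
python3 "$TASKDIR/AnsiballZ_command.py"

# 5. Clean up, mirroring the final "rm -f -r ... > /dev/null 2>&1" step.
rm -f -r "$TASKDIR"
```

In the real run each of these steps is a separate `_low_level_execute_command()` call over the multiplexed SSH connection (`mux_client_request_session` in the stderr chunks), which is why the same ssh_config debug preamble repeats before every step.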
22286 1726882786.90654: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882786.90656: _low_level_execute_command(): starting 22286 1726882786.90658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882786.5515525-22637-85955534866057/ > /dev/null 2>&1 && sleep 0' 22286 1726882786.91226: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882786.91237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882786.91250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882786.91268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882786.91314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.91393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.91407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.91436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.91570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882786.93739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882786.93743: stdout chunk (state=3): >>><<< 22286 1726882786.93745: stderr chunk (state=3): >>><<< 22286 1726882786.93748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882786.93750: handler run complete 22286 1726882786.93753: Evaluated conditional (False): False 22286 1726882786.93755: attempt loop complete, returning result 22286 1726882786.93756: variable 'item' from source: unknown 22286 1726882786.93825: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004256", "end": "2024-09-20 21:39:46.884788", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:39:46.880532" } 22286 1726882786.94250: dumping result to json 22286 1726882786.94253: done dumping result, returning 22286 1726882786.94258: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0affe814-3a2d-a75d-4836-00000000015a] 22286 1726882786.94260: sending task result for task 0affe814-3a2d-a75d-4836-00000000015a 22286 1726882786.94315: done sending task result for task 0affe814-3a2d-a75d-4836-00000000015a 22286 1726882786.94318: WORKER PROCESS EXITING 22286 1726882786.94451: no more pending results, returning what we have 22286 1726882786.94455: results queue empty 22286 1726882786.94456: checking for any_errors_fatal 22286 1726882786.94460: done checking for any_errors_fatal 22286 1726882786.94462: checking for max_fail_percentage 22286 1726882786.94464: done checking for max_fail_percentage 22286 1726882786.94465: checking to see if all hosts have failed and the running result is not ok 22286 1726882786.94466: done checking to see if all hosts have failed 22286 1726882786.94467: getting the remaining hosts for this loop 22286 1726882786.94468: done getting the remaining hosts for this loop 22286 1726882786.94472: getting the next task for host managed_node3 22286 1726882786.94478: done getting next task for host managed_node3 22286 1726882786.94480: ^ task is: TASK: Set up veth 
as managed by NetworkManager 22286 1726882786.94484: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882786.94488: getting variables 22286 1726882786.94490: in VariableManager get_vars() 22286 1726882786.94524: Calling all_inventory to load vars for managed_node3 22286 1726882786.94528: Calling groups_inventory to load vars for managed_node3 22286 1726882786.94531: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882786.94549: Calling all_plugins_play to load vars for managed_node3 22286 1726882786.94553: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882786.94557: Calling groups_plugins_play to load vars for managed_node3 22286 1726882786.94841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882786.95284: done with get_vars() 22286 1726882786.95296: done getting variables 22286 1726882786.95532: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 
September 2024 21:39:46 -0400 (0:00:01.244) 0:00:10.349 ****** 22286 1726882786.95566: entering _queue_task() for managed_node3/command 22286 1726882786.95827: worker is 1 (out of 1 available) 22286 1726882786.96040: exiting _queue_task() for managed_node3/command 22286 1726882786.96051: done queuing things up, now waiting for results queue to drain 22286 1726882786.96053: waiting for pending results... 22286 1726882786.96183: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 22286 1726882786.96232: in run() - task 0affe814-3a2d-a75d-4836-00000000015b 22286 1726882786.96255: variable 'ansible_search_path' from source: unknown 22286 1726882786.96263: variable 'ansible_search_path' from source: unknown 22286 1726882786.96313: calling self._execute() 22286 1726882786.96413: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.96427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.96448: variable 'omit' from source: magic vars 22286 1726882786.96878: variable 'ansible_distribution_major_version' from source: facts 22286 1726882786.96932: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882786.97113: variable 'type' from source: play vars 22286 1726882786.97124: variable 'state' from source: include params 22286 1726882786.97136: Evaluated conditional (type == 'veth' and state == 'present'): True 22286 1726882786.97153: variable 'omit' from source: magic vars 22286 1726882786.97205: variable 'omit' from source: magic vars 22286 1726882786.97369: variable 'interface' from source: play vars 22286 1726882786.97372: variable 'omit' from source: magic vars 22286 1726882786.97413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882786.97459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 
1726882786.97494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882786.97557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882786.97585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882786.97695: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882786.97698: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.97700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.97778: Set connection var ansible_shell_executable to /bin/sh 22286 1726882786.97795: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882786.97807: Set connection var ansible_connection to ssh 22286 1726882786.97814: Set connection var ansible_shell_type to sh 22286 1726882786.97826: Set connection var ansible_timeout to 10 22286 1726882786.97913: Set connection var ansible_pipelining to False 22286 1726882786.97916: variable 'ansible_shell_executable' from source: unknown 22286 1726882786.97918: variable 'ansible_connection' from source: unknown 22286 1726882786.97920: variable 'ansible_module_compression' from source: unknown 22286 1726882786.97922: variable 'ansible_shell_type' from source: unknown 22286 1726882786.97925: variable 'ansible_shell_executable' from source: unknown 22286 1726882786.97927: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882786.97929: variable 'ansible_pipelining' from source: unknown 22286 1726882786.97931: variable 'ansible_timeout' from source: unknown 22286 1726882786.97933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882786.98091: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882786.98108: variable 'omit' from source: magic vars 22286 1726882786.98118: starting attempt loop 22286 1726882786.98129: running the handler 22286 1726882786.98152: _low_level_execute_command(): starting 22286 1726882786.98166: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882786.99020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882786.99042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882786.99064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882786.99084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882786.99231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.00991: stdout chunk 
(state=3): >>>/root <<< 22286 1726882787.01171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.01192: stdout chunk (state=3): >>><<< 22286 1726882787.01242: stderr chunk (state=3): >>><<< 22286 1726882787.01246: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.01256: _low_level_execute_command(): starting 22286 1726882787.01268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455 `" && echo ansible-tmp-1726882787.0122797-22679-77151977705455="` echo /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455 `" ) && sleep 0' 22286 1726882787.02140: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.02198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882787.02228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882787.02256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.02406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.04494: stdout chunk (state=3): >>>ansible-tmp-1726882787.0122797-22679-77151977705455=/root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455 <<< 22286 1726882787.04630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.04667: stderr chunk (state=3): >>><<< 22286 1726882787.04671: stdout chunk (state=3): >>><<< 22286 1726882787.04694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882787.0122797-22679-77151977705455=/root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.04738: variable 'ansible_module_compression' from source: unknown 22286 1726882787.04795: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882787.04833: variable 'ansible_facts' from source: unknown 22286 1726882787.04960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py 22286 1726882787.05158: Sending initial data 22286 1726882787.05161: Sent initial data (155 bytes) 22286 1726882787.05771: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882787.05781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.05833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.05878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882787.05882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.06002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.07668: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882787.07783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882787.07900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp_otvhblz /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py <<< 22286 1726882787.07904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py" <<< 22286 1726882787.08072: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp_otvhblz" to remote "/root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py" <<< 22286 1726882787.09495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.09550: stderr chunk (state=3): >>><<< 22286 1726882787.09558: stdout chunk (state=3): >>><<< 22286 1726882787.09574: done transferring module to remote 22286 1726882787.09584: _low_level_execute_command(): starting 22286 1726882787.09589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/ /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py && sleep 0' 22286 1726882787.10001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.10005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.10008: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.10063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882787.10070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.10186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.12250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.12253: stderr chunk (state=3): >>><<< 22286 1726882787.12255: stdout chunk (state=3): >>><<< 22286 1726882787.12258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.12265: _low_level_execute_command(): starting 22286 1726882787.12267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/AnsiballZ_command.py && sleep 0' 22286 1726882787.12733: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.12742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882787.12769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.12772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.12774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.12840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882787.12850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.12966: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.32240: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:39:47.298927", "end": "2024-09-20 21:39:47.320464", "delta": "0:00:00.021537", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882787.33858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882787.33920: stderr chunk (state=3): >>><<< 22286 1726882787.33923: stdout chunk (state=3): >>><<< 22286 1726882787.33942: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:39:47.298927", "end": "2024-09-20 21:39:47.320464", "delta": "0:00:00.021537", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882787.33978: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882787.33989: _low_level_execute_command(): starting 22286 1726882787.33997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882787.0122797-22679-77151977705455/ > /dev/null 2>&1 && sleep 0' 22286 1726882787.34475: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.34481: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.34484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.34487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.34537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882787.34545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.34663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.36651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.36696: stderr chunk (state=3): >>><<< 22286 1726882787.36699: stdout chunk (state=3): >>><<< 22286 1726882787.36713: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.36721: handler run complete 22286 1726882787.36747: Evaluated conditional (False): False 22286 1726882787.36757: attempt loop complete, returning result 22286 1726882787.36760: _execute() done 22286 1726882787.36763: dumping result to json 22286 1726882787.36770: done dumping result, returning 22286 1726882787.36781: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0affe814-3a2d-a75d-4836-00000000015b] 22286 1726882787.36788: sending task result for task 0affe814-3a2d-a75d-4836-00000000015b 22286 1726882787.36894: done sending task result for task 0affe814-3a2d-a75d-4836-00000000015b 22286 1726882787.36897: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.021537", "end": "2024-09-20 21:39:47.320464", "rc": 0, "start": "2024-09-20 21:39:47.298927" } 22286 1726882787.36976: no more pending results, returning what we have 22286 1726882787.36979: results queue empty 22286 1726882787.36980: checking for any_errors_fatal 22286 1726882787.36993: done checking for any_errors_fatal 22286 1726882787.36993: checking for max_fail_percentage 22286 1726882787.36996: done checking for max_fail_percentage 22286 1726882787.36996: checking to see if all hosts have failed and the running result is 
not ok 22286 1726882787.36998: done checking to see if all hosts have failed 22286 1726882787.36998: getting the remaining hosts for this loop 22286 1726882787.37000: done getting the remaining hosts for this loop 22286 1726882787.37005: getting the next task for host managed_node3 22286 1726882787.37011: done getting next task for host managed_node3 22286 1726882787.37014: ^ task is: TASK: Delete veth interface {{ interface }} 22286 1726882787.37017: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.37021: getting variables 22286 1726882787.37022: in VariableManager get_vars() 22286 1726882787.37065: Calling all_inventory to load vars for managed_node3 22286 1726882787.37068: Calling groups_inventory to load vars for managed_node3 22286 1726882787.37071: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.37082: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.37085: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.37088: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.37288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.37472: done with get_vars() 22286 1726882787.37482: done getting variables 22286 1726882787.37529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882787.37626: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:39:47 -0400 (0:00:00.420) 0:00:10.769 ****** 22286 1726882787.37652: entering _queue_task() for managed_node3/command 22286 1726882787.37857: worker is 1 (out of 1 available) 22286 1726882787.37868: exiting _queue_task() for managed_node3/command 22286 1726882787.37883: done queuing things up, now waiting for results queue to drain 22286 1726882787.37885: waiting for pending results... 
22286 1726882787.38052: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 22286 1726882787.38132: in run() - task 0affe814-3a2d-a75d-4836-00000000015c 22286 1726882787.38146: variable 'ansible_search_path' from source: unknown 22286 1726882787.38149: variable 'ansible_search_path' from source: unknown 22286 1726882787.38184: calling self._execute() 22286 1726882787.38256: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.38263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.38273: variable 'omit' from source: magic vars 22286 1726882787.38571: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.38584: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.38752: variable 'type' from source: play vars 22286 1726882787.38756: variable 'state' from source: include params 22286 1726882787.38763: variable 'interface' from source: play vars 22286 1726882787.38766: variable 'current_interfaces' from source: set_fact 22286 1726882787.38775: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 22286 1726882787.38778: when evaluation is False, skipping this task 22286 1726882787.38785: _execute() done 22286 1726882787.38787: dumping result to json 22286 1726882787.38795: done dumping result, returning 22286 1726882787.38798: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0affe814-3a2d-a75d-4836-00000000015c] 22286 1726882787.38805: sending task result for task 0affe814-3a2d-a75d-4836-00000000015c 22286 1726882787.38897: done sending task result for task 0affe814-3a2d-a75d-4836-00000000015c 22286 1726882787.38901: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
22286 1726882787.38953: no more pending results, returning what we have 22286 1726882787.38956: results queue empty 22286 1726882787.38957: checking for any_errors_fatal 22286 1726882787.38963: done checking for any_errors_fatal 22286 1726882787.38964: checking for max_fail_percentage 22286 1726882787.38966: done checking for max_fail_percentage 22286 1726882787.38967: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.38968: done checking to see if all hosts have failed 22286 1726882787.38969: getting the remaining hosts for this loop 22286 1726882787.38970: done getting the remaining hosts for this loop 22286 1726882787.38974: getting the next task for host managed_node3 22286 1726882787.38979: done getting next task for host managed_node3 22286 1726882787.38982: ^ task is: TASK: Create dummy interface {{ interface }} 22286 1726882787.38985: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.38989: getting variables 22286 1726882787.38990: in VariableManager get_vars() 22286 1726882787.39026: Calling all_inventory to load vars for managed_node3 22286 1726882787.39029: Calling groups_inventory to load vars for managed_node3 22286 1726882787.39032: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.39042: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.39044: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.39047: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.39194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.39377: done with get_vars() 22286 1726882787.39386: done getting variables 22286 1726882787.39429: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882787.39516: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:39:47 -0400 (0:00:00.018) 0:00:10.788 ****** 22286 1726882787.39541: entering _queue_task() for managed_node3/command 22286 1726882787.39726: worker is 1 (out of 1 available) 22286 1726882787.39741: exiting _queue_task() for managed_node3/command 22286 1726882787.39754: done queuing things up, now waiting for results queue to drain 22286 1726882787.39755: waiting for pending results... 
22286 1726882787.39912: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 22286 1726882787.39989: in run() - task 0affe814-3a2d-a75d-4836-00000000015d 22286 1726882787.40002: variable 'ansible_search_path' from source: unknown 22286 1726882787.40007: variable 'ansible_search_path' from source: unknown 22286 1726882787.40038: calling self._execute() 22286 1726882787.40109: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.40116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.40126: variable 'omit' from source: magic vars 22286 1726882787.40447: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.40453: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.40621: variable 'type' from source: play vars 22286 1726882787.40625: variable 'state' from source: include params 22286 1726882787.40632: variable 'interface' from source: play vars 22286 1726882787.40636: variable 'current_interfaces' from source: set_fact 22286 1726882787.40648: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 22286 1726882787.40651: when evaluation is False, skipping this task 22286 1726882787.40655: _execute() done 22286 1726882787.40658: dumping result to json 22286 1726882787.40660: done dumping result, returning 22286 1726882787.40740: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0affe814-3a2d-a75d-4836-00000000015d] 22286 1726882787.40743: sending task result for task 0affe814-3a2d-a75d-4836-00000000015d 22286 1726882787.40809: done sending task result for task 0affe814-3a2d-a75d-4836-00000000015d 22286 1726882787.40816: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result 
was False" } 22286 1726882787.40858: no more pending results, returning what we have 22286 1726882787.40860: results queue empty 22286 1726882787.40861: checking for any_errors_fatal 22286 1726882787.40865: done checking for any_errors_fatal 22286 1726882787.40865: checking for max_fail_percentage 22286 1726882787.40867: done checking for max_fail_percentage 22286 1726882787.40867: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.40868: done checking to see if all hosts have failed 22286 1726882787.40868: getting the remaining hosts for this loop 22286 1726882787.40869: done getting the remaining hosts for this loop 22286 1726882787.40872: getting the next task for host managed_node3 22286 1726882787.40877: done getting next task for host managed_node3 22286 1726882787.40879: ^ task is: TASK: Delete dummy interface {{ interface }} 22286 1726882787.40881: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.40885: getting variables 22286 1726882787.40886: in VariableManager get_vars() 22286 1726882787.40913: Calling all_inventory to load vars for managed_node3 22286 1726882787.40915: Calling groups_inventory to load vars for managed_node3 22286 1726882787.40917: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.40924: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.40926: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.40928: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.41109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.41288: done with get_vars() 22286 1726882787.41297: done getting variables 22286 1726882787.41344: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882787.41424: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:39:47 -0400 (0:00:00.019) 0:00:10.807 ****** 22286 1726882787.41448: entering _queue_task() for managed_node3/command 22286 1726882787.41629: worker is 1 (out of 1 available) 22286 1726882787.41645: exiting _queue_task() for managed_node3/command 22286 1726882787.41658: done queuing things up, now waiting for results queue to drain 22286 1726882787.41660: waiting for pending results... 
22286 1726882787.41810: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 22286 1726882787.41892: in run() - task 0affe814-3a2d-a75d-4836-00000000015e 22286 1726882787.41896: variable 'ansible_search_path' from source: unknown 22286 1726882787.41903: variable 'ansible_search_path' from source: unknown 22286 1726882787.41933: calling self._execute() 22286 1726882787.42001: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.42011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.42020: variable 'omit' from source: magic vars 22286 1726882787.42301: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.42311: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.42477: variable 'type' from source: play vars 22286 1726882787.42485: variable 'state' from source: include params 22286 1726882787.42490: variable 'interface' from source: play vars 22286 1726882787.42495: variable 'current_interfaces' from source: set_fact 22286 1726882787.42503: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 22286 1726882787.42506: when evaluation is False, skipping this task 22286 1726882787.42509: _execute() done 22286 1726882787.42513: dumping result to json 22286 1726882787.42518: done dumping result, returning 22286 1726882787.42524: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0affe814-3a2d-a75d-4836-00000000015e] 22286 1726882787.42530: sending task result for task 0affe814-3a2d-a75d-4836-00000000015e 22286 1726882787.42621: done sending task result for task 0affe814-3a2d-a75d-4836-00000000015e 22286 1726882787.42624: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 22286 1726882787.42689: no more pending results, returning what we have 22286 1726882787.42692: results queue empty 22286 1726882787.42693: checking for any_errors_fatal 22286 1726882787.42698: done checking for any_errors_fatal 22286 1726882787.42699: checking for max_fail_percentage 22286 1726882787.42700: done checking for max_fail_percentage 22286 1726882787.42701: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.42702: done checking to see if all hosts have failed 22286 1726882787.42703: getting the remaining hosts for this loop 22286 1726882787.42705: done getting the remaining hosts for this loop 22286 1726882787.42709: getting the next task for host managed_node3 22286 1726882787.42714: done getting next task for host managed_node3 22286 1726882787.42717: ^ task is: TASK: Create tap interface {{ interface }} 22286 1726882787.42721: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.42724: getting variables 22286 1726882787.42725: in VariableManager get_vars() 22286 1726882787.42757: Calling all_inventory to load vars for managed_node3 22286 1726882787.42759: Calling groups_inventory to load vars for managed_node3 22286 1726882787.42761: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.42769: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.42771: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.42773: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.42921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.43100: done with get_vars() 22286 1726882787.43108: done getting variables 22286 1726882787.43155: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882787.43238: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:39:47 -0400 (0:00:00.018) 0:00:10.826 ****** 22286 1726882787.43260: entering _queue_task() for managed_node3/command 22286 1726882787.43442: worker is 1 (out of 1 available) 22286 1726882787.43456: exiting _queue_task() for managed_node3/command 22286 1726882787.43470: done queuing things up, now waiting for results queue to drain 22286 1726882787.43471: waiting for pending results... 
22286 1726882787.43622: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 22286 1726882787.43706: in run() - task 0affe814-3a2d-a75d-4836-00000000015f 22286 1726882787.43721: variable 'ansible_search_path' from source: unknown 22286 1726882787.43724: variable 'ansible_search_path' from source: unknown 22286 1726882787.43760: calling self._execute() 22286 1726882787.43827: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.43840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.43851: variable 'omit' from source: magic vars 22286 1726882787.44183: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.44193: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.44358: variable 'type' from source: play vars 22286 1726882787.44362: variable 'state' from source: include params 22286 1726882787.44371: variable 'interface' from source: play vars 22286 1726882787.44374: variable 'current_interfaces' from source: set_fact 22286 1726882787.44384: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 22286 1726882787.44387: when evaluation is False, skipping this task 22286 1726882787.44389: _execute() done 22286 1726882787.44392: dumping result to json 22286 1726882787.44397: done dumping result, returning 22286 1726882787.44403: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0affe814-3a2d-a75d-4836-00000000015f] 22286 1726882787.44409: sending task result for task 0affe814-3a2d-a75d-4836-00000000015f 22286 1726882787.44498: done sending task result for task 0affe814-3a2d-a75d-4836-00000000015f 22286 1726882787.44501: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 22286 1726882787.44550: no more pending results, returning what we have 22286 1726882787.44553: results queue empty 22286 1726882787.44554: checking for any_errors_fatal 22286 1726882787.44559: done checking for any_errors_fatal 22286 1726882787.44560: checking for max_fail_percentage 22286 1726882787.44562: done checking for max_fail_percentage 22286 1726882787.44563: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.44564: done checking to see if all hosts have failed 22286 1726882787.44565: getting the remaining hosts for this loop 22286 1726882787.44566: done getting the remaining hosts for this loop 22286 1726882787.44570: getting the next task for host managed_node3 22286 1726882787.44575: done getting next task for host managed_node3 22286 1726882787.44577: ^ task is: TASK: Delete tap interface {{ interface }} 22286 1726882787.44580: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.44584: getting variables 22286 1726882787.44585: in VariableManager get_vars() 22286 1726882787.44621: Calling all_inventory to load vars for managed_node3 22286 1726882787.44623: Calling groups_inventory to load vars for managed_node3 22286 1726882787.44625: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.44632: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.44637: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.44640: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.44813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.44990: done with get_vars() 22286 1726882787.44998: done getting variables 22286 1726882787.45041: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882787.45118: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:39:47 -0400 (0:00:00.018) 0:00:10.844 ****** 22286 1726882787.45141: entering _queue_task() for managed_node3/command 22286 1726882787.45317: worker is 1 (out of 1 available) 22286 1726882787.45331: exiting _queue_task() for managed_node3/command 22286 1726882787.45348: done queuing things up, now waiting for results queue to drain 22286 1726882787.45350: waiting for pending results... 
22286 1726882787.45495: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 22286 1726882787.45559: in run() - task 0affe814-3a2d-a75d-4836-000000000160 22286 1726882787.45572: variable 'ansible_search_path' from source: unknown 22286 1726882787.45576: variable 'ansible_search_path' from source: unknown 22286 1726882787.45609: calling self._execute() 22286 1726882787.45675: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.45683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.45697: variable 'omit' from source: magic vars 22286 1726882787.45971: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.45985: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.46148: variable 'type' from source: play vars 22286 1726882787.46152: variable 'state' from source: include params 22286 1726882787.46158: variable 'interface' from source: play vars 22286 1726882787.46163: variable 'current_interfaces' from source: set_fact 22286 1726882787.46171: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 22286 1726882787.46173: when evaluation is False, skipping this task 22286 1726882787.46176: _execute() done 22286 1726882787.46183: dumping result to json 22286 1726882787.46186: done dumping result, returning 22286 1726882787.46193: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0affe814-3a2d-a75d-4836-000000000160] 22286 1726882787.46199: sending task result for task 0affe814-3a2d-a75d-4836-000000000160 22286 1726882787.46282: done sending task result for task 0affe814-3a2d-a75d-4836-000000000160 22286 1726882787.46285: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
22286 1726882787.46338: no more pending results, returning what we have 22286 1726882787.46341: results queue empty 22286 1726882787.46343: checking for any_errors_fatal 22286 1726882787.46347: done checking for any_errors_fatal 22286 1726882787.46348: checking for max_fail_percentage 22286 1726882787.46350: done checking for max_fail_percentage 22286 1726882787.46351: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.46352: done checking to see if all hosts have failed 22286 1726882787.46353: getting the remaining hosts for this loop 22286 1726882787.46354: done getting the remaining hosts for this loop 22286 1726882787.46358: getting the next task for host managed_node3 22286 1726882787.46366: done getting next task for host managed_node3 22286 1726882787.46368: ^ task is: TASK: Set up gateway ip on veth peer 22286 1726882787.46370: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.46375: getting variables 22286 1726882787.46376: in VariableManager get_vars() 22286 1726882787.46412: Calling all_inventory to load vars for managed_node3 22286 1726882787.46414: Calling groups_inventory to load vars for managed_node3 22286 1726882787.46416: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.46424: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.46426: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.46428: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.46580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.46775: done with get_vars() 22286 1726882787.46784: done getting variables 22286 1726882787.46856: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Friday 20 September 2024 21:39:47 -0400 (0:00:00.017) 0:00:10.862 ****** 22286 1726882787.46879: entering _queue_task() for managed_node3/shell 22286 1726882787.46880: Creating lock for shell 22286 1726882787.47057: worker is 1 (out of 1 available) 22286 1726882787.47069: exiting _queue_task() for managed_node3/shell 22286 1726882787.47083: done queuing things up, now waiting for results queue to drain 22286 1726882787.47085: waiting for pending results... 
22286 1726882787.47245: running TaskExecutor() for managed_node3/TASK: Set up gateway ip on veth peer 22286 1726882787.47310: in run() - task 0affe814-3a2d-a75d-4836-00000000000d 22286 1726882787.47324: variable 'ansible_search_path' from source: unknown 22286 1726882787.47359: calling self._execute() 22286 1726882787.47426: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.47442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.47457: variable 'omit' from source: magic vars 22286 1726882787.47750: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.47762: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.47770: variable 'omit' from source: magic vars 22286 1726882787.47797: variable 'omit' from source: magic vars 22286 1726882787.47905: variable 'interface' from source: play vars 22286 1726882787.47919: variable 'omit' from source: magic vars 22286 1726882787.47955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882787.47987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882787.48006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882787.48023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882787.48035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882787.48063: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882787.48067: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.48071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.48158: 
Set connection var ansible_shell_executable to /bin/sh 22286 1726882787.48166: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882787.48170: Set connection var ansible_connection to ssh 22286 1726882787.48172: Set connection var ansible_shell_type to sh 22286 1726882787.48197: Set connection var ansible_timeout to 10 22286 1726882787.48200: Set connection var ansible_pipelining to False 22286 1726882787.48209: variable 'ansible_shell_executable' from source: unknown 22286 1726882787.48212: variable 'ansible_connection' from source: unknown 22286 1726882787.48215: variable 'ansible_module_compression' from source: unknown 22286 1726882787.48221: variable 'ansible_shell_type' from source: unknown 22286 1726882787.48224: variable 'ansible_shell_executable' from source: unknown 22286 1726882787.48226: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.48232: variable 'ansible_pipelining' from source: unknown 22286 1726882787.48236: variable 'ansible_timeout' from source: unknown 22286 1726882787.48241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.48359: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882787.48368: variable 'omit' from source: magic vars 22286 1726882787.48374: starting attempt loop 22286 1726882787.48379: running the handler 22286 1726882787.48388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882787.48404: 
_low_level_execute_command(): starting 22286 1726882787.48414: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882787.48955: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.48959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.48962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.48964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.49082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.49214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.50965: stdout chunk (state=3): >>>/root <<< 22286 1726882787.51081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.51123: stderr chunk (state=3): >>><<< 22286 1726882787.51126: stdout chunk (state=3): >>><<< 22286 1726882787.51257: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.51263: _low_level_execute_command(): starting 22286 1726882787.51267: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480 `" && echo ansible-tmp-1726882787.511647-22707-167625905518480="` echo /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480 `" ) && sleep 0' 22286 1726882787.51829: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882787.51849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882787.51991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882787.52004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882787.52047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.52160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.54499: stdout chunk (state=3): >>>ansible-tmp-1726882787.511647-22707-167625905518480=/root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480 <<< 22286 1726882787.54713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.54716: stdout chunk (state=3): >>><<< 22286 1726882787.54717: stderr chunk (state=3): >>><<< 22286 1726882787.54800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882787.511647-22707-167625905518480=/root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.54804: variable 'ansible_module_compression' from source: unknown 22286 1726882787.54807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882787.54847: variable 'ansible_facts' from source: unknown 22286 1726882787.54910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py 22286 1726882787.55018: Sending initial data 22286 1726882787.55021: Sent initial data (155 bytes) 22286 1726882787.55471: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882787.55475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882787.55480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 
22286 1726882787.55484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882787.55487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.55545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882787.55548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.55661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.57407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882787.57531: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882787.57649: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp_m6gg7sh /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py <<< 22286 1726882787.57652: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py" <<< 22286 1726882787.57749: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp_m6gg7sh" to remote "/root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py" <<< 22286 1726882787.59212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.59242: stderr chunk (state=3): >>><<< 22286 1726882787.59261: stdout chunk (state=3): >>><<< 22286 1726882787.59291: done transferring module to remote 22286 1726882787.59387: _low_level_execute_command(): starting 22286 1726882787.59390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/ /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py && sleep 0' 22286 1726882787.59939: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882787.59953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882787.59968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.60002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882787.60052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.60135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882787.60178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.60321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.62275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.62296: stderr chunk (state=3): >>><<< 22286 1726882787.62305: stdout chunk (state=3): >>><<< 22286 1726882787.62324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.62332: _low_level_execute_command(): starting 22286 1726882787.62344: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/AnsiballZ_command.py && sleep 0' 22286 1726882787.62983: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882787.63007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882787.63118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882787.63142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882787.63159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882787.63185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 22286 1726882787.63347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.83108: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:39:47.804619", "end": "2024-09-20 21:39:47.828715", "delta": "0:00:00.024096", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882787.84866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882787.84910: stderr chunk (state=3): >>><<< 22286 1726882787.84927: stdout chunk (state=3): >>><<< 22286 1726882787.84959: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:39:47.804619", "end": "2024-09-20 21:39:47.828715", "delta": "0:00:00.024096", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882787.85016: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882787.85033: _low_level_execute_command(): starting 22286 1726882787.85046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882787.511647-22707-167625905518480/ > /dev/null 2>&1 && sleep 0' 22286 1726882787.85833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882787.85855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882787.85891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882787.86037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882787.88169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882787.88173: stdout chunk (state=3): >>><<< 22286 1726882787.88176: stderr chunk (state=3): >>><<< 22286 1726882787.88198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882787.88339: handler run complete 22286 1726882787.88342: Evaluated conditional (False): False 22286 1726882787.88345: attempt loop complete, returning result 22286 1726882787.88347: _execute() done 22286 1726882787.88350: dumping result to json 22286 1726882787.88352: done dumping result, returning 22286 1726882787.88354: done running TaskExecutor() for managed_node3/TASK: Set up gateway ip on veth peer [0affe814-3a2d-a75d-4836-00000000000d] 22286 1726882787.88356: sending task result for task 0affe814-3a2d-a75d-4836-00000000000d 22286 1726882787.88438: done sending task result for task 0affe814-3a2d-a75d-4836-00000000000d 22286 1726882787.88441: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.024096", "end": "2024-09-20 21:39:47.828715", "rc": 0, "start": "2024-09-20 21:39:47.804619" } 22286 1726882787.88526: no more pending results, returning what we have 22286 1726882787.88531: results queue empty 22286 1726882787.88532: checking for any_errors_fatal 22286 1726882787.88545: done checking for any_errors_fatal 22286 1726882787.88546: checking for max_fail_percentage 22286 1726882787.88549: done checking for max_fail_percentage 22286 1726882787.88550: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.88556: done checking to see if all hosts have failed 22286 1726882787.88557: getting the remaining hosts for this loop 22286 1726882787.88559: done getting the remaining hosts for this loop 22286 1726882787.88565: getting the next task for host managed_node3 22286 1726882787.88573: done getting next task for host managed_node3 22286 1726882787.88577: ^ task is: TASK: TEST: I can configure an interface with static ipv6 
config 22286 1726882787.88580: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882787.88584: getting variables 22286 1726882787.88586: in VariableManager get_vars() 22286 1726882787.88633: Calling all_inventory to load vars for managed_node3 22286 1726882787.88845: Calling groups_inventory to load vars for managed_node3 22286 1726882787.88849: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.88862: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.88865: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.88869: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.89262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.89609: done with get_vars() 22286 1726882787.89623: done getting variables 22286 1726882787.89692: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Friday 20 September 2024 21:39:47 -0400 (0:00:00.428) 0:00:11.290 ****** 22286 1726882787.89732: entering _queue_task() for managed_node3/debug 22286 1726882787.90015: worker is 1 (out of 1 available) 22286 1726882787.90026: exiting _queue_task() for managed_node3/debug 22286 1726882787.90144: done queuing things up, now waiting for 
results queue to drain 22286 1726882787.90153: waiting for pending results... 22286 1726882787.90497: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with static ipv6 config 22286 1726882787.90503: in run() - task 0affe814-3a2d-a75d-4836-00000000000f 22286 1726882787.90506: variable 'ansible_search_path' from source: unknown 22286 1726882787.90589: calling self._execute() 22286 1726882787.90664: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.90678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.90704: variable 'omit' from source: magic vars 22286 1726882787.91460: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.91463: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.91465: variable 'omit' from source: magic vars 22286 1726882787.91531: variable 'omit' from source: magic vars 22286 1726882787.91534: variable 'omit' from source: magic vars 22286 1726882787.91586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882787.91631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882787.92097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882787.92133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882787.92154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882787.92200: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882787.92209: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.92228: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 22286 1726882787.92381: Set connection var ansible_shell_executable to /bin/sh 22286 1726882787.92405: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882787.92441: Set connection var ansible_connection to ssh 22286 1726882787.92444: Set connection var ansible_shell_type to sh 22286 1726882787.92447: Set connection var ansible_timeout to 10 22286 1726882787.92449: Set connection var ansible_pipelining to False 22286 1726882787.92479: variable 'ansible_shell_executable' from source: unknown 22286 1726882787.92489: variable 'ansible_connection' from source: unknown 22286 1726882787.92497: variable 'ansible_module_compression' from source: unknown 22286 1726882787.92513: variable 'ansible_shell_type' from source: unknown 22286 1726882787.92516: variable 'ansible_shell_executable' from source: unknown 22286 1726882787.92537: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.92540: variable 'ansible_pipelining' from source: unknown 22286 1726882787.92543: variable 'ansible_timeout' from source: unknown 22286 1726882787.92554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.92771: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882787.92775: variable 'omit' from source: magic vars 22286 1726882787.92777: starting attempt loop 22286 1726882787.92780: running the handler 22286 1726882787.92837: handler run complete 22286 1726882787.92936: attempt loop complete, returning result 22286 1726882787.92941: _execute() done 22286 1726882787.92944: dumping result to json 22286 1726882787.92946: done dumping result, returning 22286 1726882787.92955: done running TaskExecutor() for managed_node3/TASK: 
TEST: I can configure an interface with static ipv6 config [0affe814-3a2d-a75d-4836-00000000000f] 22286 1726882787.92957: sending task result for task 0affe814-3a2d-a75d-4836-00000000000f ok: [managed_node3] => {} MSG: ################################################## 22286 1726882787.93111: no more pending results, returning what we have 22286 1726882787.93114: results queue empty 22286 1726882787.93115: checking for any_errors_fatal 22286 1726882787.93123: done checking for any_errors_fatal 22286 1726882787.93124: checking for max_fail_percentage 22286 1726882787.93126: done checking for max_fail_percentage 22286 1726882787.93127: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.93128: done checking to see if all hosts have failed 22286 1726882787.93128: getting the remaining hosts for this loop 22286 1726882787.93130: done getting the remaining hosts for this loop 22286 1726882787.93136: getting the next task for host managed_node3 22286 1726882787.93144: done getting next task for host managed_node3 22286 1726882787.93152: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22286 1726882787.93156: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.93174: getting variables 22286 1726882787.93176: in VariableManager get_vars() 22286 1726882787.93220: Calling all_inventory to load vars for managed_node3 22286 1726882787.93223: Calling groups_inventory to load vars for managed_node3 22286 1726882787.93226: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.93355: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.93360: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.93366: done sending task result for task 0affe814-3a2d-a75d-4836-00000000000f 22286 1726882787.93368: WORKER PROCESS EXITING 22286 1726882787.93373: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.94015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.94298: done with get_vars() 22286 1726882787.94309: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:47 -0400 (0:00:00.046) 0:00:11.337 ****** 22286 1726882787.94416: entering _queue_task() for managed_node3/include_tasks 22286 1726882787.94696: worker is 1 (out of 1 available) 22286 1726882787.94709: exiting _queue_task() for managed_node3/include_tasks 22286 1726882787.94722: done queuing things up, now waiting for results queue to drain 22286 1726882787.94724: waiting for pending results... 
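The "Set up gateway ip on veth peer" task earlier in this run executed a fixed block of `ip` commands through the shell module (recorded verbatim in its result). As a sketch only, the same command block can be generated parametrically; the `netns_gateway_commands` helper is hypothetical, while the namespace, device, and address values mirror the log:

```python
def netns_gateway_commands(ns, dev, addr):
    """Build the shell commands the task ran: create a network namespace,
    move the veth peer into it, add an IPv6 address, and bring it up.
    (Hypothetical helper; running these commands requires root.)"""
    return [
        f"ip netns add {ns}",
        f"ip link set {dev} netns {ns}",
        f"ip netns exec {ns} ip -6 addr add {addr} dev {dev}",
        f"ip netns exec {ns} ip link set {dev} up",
    ]

# Values taken from the cmd field of the task result above.
print("\n".join(netns_gateway_commands("ns1", "peerveth0", "2001:db8::1/32")))
```

Note the task result reports `rc: 0` with a delta of about 24 ms, consistent with these four commands completing without output.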
22286 1726882787.94987: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22286 1726882787.95155: in run() - task 0affe814-3a2d-a75d-4836-000000000017 22286 1726882787.95178: variable 'ansible_search_path' from source: unknown 22286 1726882787.95188: variable 'ansible_search_path' from source: unknown 22286 1726882787.95243: calling self._execute() 22286 1726882787.95352: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882787.95368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882787.95388: variable 'omit' from source: magic vars 22286 1726882787.95838: variable 'ansible_distribution_major_version' from source: facts 22286 1726882787.95860: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882787.95882: _execute() done 22286 1726882787.95939: dumping result to json 22286 1726882787.95942: done dumping result, returning 22286 1726882787.95945: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-a75d-4836-000000000017] 22286 1726882787.95948: sending task result for task 0affe814-3a2d-a75d-4836-000000000017 22286 1726882787.96183: no more pending results, returning what we have 22286 1726882787.96189: in VariableManager get_vars() 22286 1726882787.96254: Calling all_inventory to load vars for managed_node3 22286 1726882787.96258: Calling groups_inventory to load vars for managed_node3 22286 1726882787.96263: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.96280: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.96284: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.96289: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.96703: done sending task result for task 0affe814-3a2d-a75d-4836-000000000017 22286 
1726882787.96706: WORKER PROCESS EXITING 22286 1726882787.96735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882787.97087: done with get_vars() 22286 1726882787.97101: variable 'ansible_search_path' from source: unknown 22286 1726882787.97102: variable 'ansible_search_path' from source: unknown 22286 1726882787.97151: we have included files to process 22286 1726882787.97152: generating all_blocks data 22286 1726882787.97155: done generating all_blocks data 22286 1726882787.97160: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22286 1726882787.97161: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22286 1726882787.97164: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22286 1726882787.98103: done processing included file 22286 1726882787.98105: iterating over new_blocks loaded from include file 22286 1726882787.98107: in VariableManager get_vars() 22286 1726882787.98140: done with get_vars() 22286 1726882787.98142: filtering new block on tags 22286 1726882787.98165: done filtering new block on tags 22286 1726882787.98168: in VariableManager get_vars() 22286 1726882787.98203: done with get_vars() 22286 1726882787.98205: filtering new block on tags 22286 1726882787.98233: done filtering new block on tags 22286 1726882787.98238: in VariableManager get_vars() 22286 1726882787.98267: done with get_vars() 22286 1726882787.98270: filtering new block on tags 22286 1726882787.98298: done filtering new block on tags 22286 1726882787.98301: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 22286 1726882787.98307: extending task lists for all hosts 
with included blocks 22286 1726882787.99491: done extending task lists 22286 1726882787.99493: done processing included files 22286 1726882787.99494: results queue empty 22286 1726882787.99495: checking for any_errors_fatal 22286 1726882787.99499: done checking for any_errors_fatal 22286 1726882787.99500: checking for max_fail_percentage 22286 1726882787.99501: done checking for max_fail_percentage 22286 1726882787.99502: checking to see if all hosts have failed and the running result is not ok 22286 1726882787.99503: done checking to see if all hosts have failed 22286 1726882787.99504: getting the remaining hosts for this loop 22286 1726882787.99506: done getting the remaining hosts for this loop 22286 1726882787.99509: getting the next task for host managed_node3 22286 1726882787.99514: done getting next task for host managed_node3 22286 1726882787.99517: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22286 1726882787.99521: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882787.99532: getting variables 22286 1726882787.99533: in VariableManager get_vars() 22286 1726882787.99552: Calling all_inventory to load vars for managed_node3 22286 1726882787.99555: Calling groups_inventory to load vars for managed_node3 22286 1726882787.99558: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882787.99564: Calling all_plugins_play to load vars for managed_node3 22286 1726882787.99567: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882787.99571: Calling groups_plugins_play to load vars for managed_node3 22286 1726882787.99823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882788.00171: done with get_vars() 22286 1726882788.00182: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:48 -0400 (0:00:00.058) 0:00:11.396 ****** 22286 1726882788.00270: entering _queue_task() for managed_node3/setup 22286 1726882788.00542: worker is 1 (out of 1 available) 22286 1726882788.00555: exiting _queue_task() for managed_node3/setup 22286 1726882788.00568: done queuing things up, now waiting for results queue to drain 22286 1726882788.00569: waiting for pending results... 
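The "Ensure ansible_facts used by role are present" task queued here is gated by the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: when every required fact is already gathered, the difference is empty and the task is skipped, as the log shows. A minimal Python sketch of that check follows; the fact names and values are illustrative, not taken from this run, and the list comprehension only approximates Ansible's `difference` filter (which also deduplicates):

```python
def facts_missing(required, gathered):
    """Required fact names not present in the gathered facts
    (an order-preserving approximation of Jinja's difference filter)."""
    return [name for name in required if name not in gathered]

# Illustrative facts; the actual __network_required_facts list comes
# from the role's defaults and is not shown in this log.
required = ["distribution", "distribution_major_version", "os_family"]
gathered = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# length > 0 would force the setup (fact-gathering) task to run;
# an empty difference evaluates the conditional to False and skips it.
print(len(facts_missing(required, gathered.keys())) > 0)  # → False
```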
22286 1726882788.00848: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22286 1726882788.01030: in run() - task 0affe814-3a2d-a75d-4836-0000000001fc 22286 1726882788.01055: variable 'ansible_search_path' from source: unknown 22286 1726882788.01064: variable 'ansible_search_path' from source: unknown 22286 1726882788.01113: calling self._execute() 22286 1726882788.01214: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882788.01227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882788.01245: variable 'omit' from source: magic vars 22286 1726882788.01676: variable 'ansible_distribution_major_version' from source: facts 22286 1726882788.01699: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882788.01957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882788.04592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882788.04684: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882788.04731: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882788.04791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882788.04828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882788.04972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882788.04998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882788.05031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882788.05094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882788.05114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882788.05190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882788.05440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882788.05444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882788.05447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882788.05449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882788.05526: variable '__network_required_facts' from source: role 
'' defaults 22286 1726882788.05543: variable 'ansible_facts' from source: unknown 22286 1726882788.05681: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22286 1726882788.05691: when evaluation is False, skipping this task 22286 1726882788.05698: _execute() done 22286 1726882788.05704: dumping result to json 22286 1726882788.05711: done dumping result, returning 22286 1726882788.05722: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-a75d-4836-0000000001fc] 22286 1726882788.05730: sending task result for task 0affe814-3a2d-a75d-4836-0000000001fc skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882788.05945: no more pending results, returning what we have 22286 1726882788.05949: results queue empty 22286 1726882788.05950: checking for any_errors_fatal 22286 1726882788.05952: done checking for any_errors_fatal 22286 1726882788.05953: checking for max_fail_percentage 22286 1726882788.05955: done checking for max_fail_percentage 22286 1726882788.05956: checking to see if all hosts have failed and the running result is not ok 22286 1726882788.05957: done checking to see if all hosts have failed 22286 1726882788.05958: getting the remaining hosts for this loop 22286 1726882788.05960: done getting the remaining hosts for this loop 22286 1726882788.05965: getting the next task for host managed_node3 22286 1726882788.05976: done getting next task for host managed_node3 22286 1726882788.05980: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22286 1726882788.05985: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882788.06007: getting variables 22286 1726882788.06009: in VariableManager get_vars() 22286 1726882788.06060: Calling all_inventory to load vars for managed_node3 22286 1726882788.06064: Calling groups_inventory to load vars for managed_node3 22286 1726882788.06067: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882788.06079: Calling all_plugins_play to load vars for managed_node3 22286 1726882788.06083: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882788.06088: Calling groups_plugins_play to load vars for managed_node3 22286 1726882788.06589: done sending task result for task 0affe814-3a2d-a75d-4836-0000000001fc 22286 1726882788.06593: WORKER PROCESS EXITING 22286 1726882788.06621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882788.07000: done with get_vars() 22286 1726882788.07013: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:48 -0400 (0:00:00.068) 0:00:11.464 ****** 22286 1726882788.07137: entering _queue_task() for managed_node3/stat 22286 1726882788.07524: worker is 
1 (out of 1 available) 22286 1726882788.07538: exiting _queue_task() for managed_node3/stat 22286 1726882788.07551: done queuing things up, now waiting for results queue to drain 22286 1726882788.07552: waiting for pending results... 22286 1726882788.07741: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 22286 1726882788.07929: in run() - task 0affe814-3a2d-a75d-4836-0000000001fe 22286 1726882788.07958: variable 'ansible_search_path' from source: unknown 22286 1726882788.07968: variable 'ansible_search_path' from source: unknown 22286 1726882788.08017: calling self._execute() 22286 1726882788.08119: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882788.08133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882788.08153: variable 'omit' from source: magic vars 22286 1726882788.08591: variable 'ansible_distribution_major_version' from source: facts 22286 1726882788.08618: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882788.08839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882788.09165: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882788.09220: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882788.09272: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882788.09319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882788.09488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882788.09492: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882788.09497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882788.09538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882788.09644: variable '__network_is_ostree' from source: set_fact 22286 1726882788.09657: Evaluated conditional (not __network_is_ostree is defined): False 22286 1726882788.09666: when evaluation is False, skipping this task 22286 1726882788.09673: _execute() done 22286 1726882788.09680: dumping result to json 22286 1726882788.09691: done dumping result, returning 22286 1726882788.09708: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-a75d-4836-0000000001fe] 22286 1726882788.09718: sending task result for task 0affe814-3a2d-a75d-4836-0000000001fe skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22286 1726882788.09860: no more pending results, returning what we have 22286 1726882788.09864: results queue empty 22286 1726882788.09865: checking for any_errors_fatal 22286 1726882788.09871: done checking for any_errors_fatal 22286 1726882788.09872: checking for max_fail_percentage 22286 1726882788.09875: done checking for max_fail_percentage 22286 1726882788.09875: checking to see if all hosts have failed and the running result is not ok 22286 1726882788.09876: done checking to see if all hosts have failed 22286 1726882788.09877: getting the remaining hosts for this loop 22286 
1726882788.09879: done getting the remaining hosts for this loop 22286 1726882788.09884: getting the next task for host managed_node3 22286 1726882788.09890: done getting next task for host managed_node3 22286 1726882788.09894: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22286 1726882788.09899: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882788.09913: getting variables 22286 1726882788.09915: in VariableManager get_vars() 22286 1726882788.09962: Calling all_inventory to load vars for managed_node3 22286 1726882788.09965: Calling groups_inventory to load vars for managed_node3 22286 1726882788.09968: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882788.09979: Calling all_plugins_play to load vars for managed_node3 22286 1726882788.09983: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882788.09987: Calling groups_plugins_play to load vars for managed_node3 22286 1726882788.10464: done sending task result for task 0affe814-3a2d-a75d-4836-0000000001fe 22286 1726882788.10467: WORKER PROCESS EXITING 22286 1726882788.10499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882788.10848: done with get_vars() 22286 1726882788.10860: done getting variables 22286 1726882788.10927: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:48 -0400 (0:00:00.038) 0:00:11.503 ****** 22286 1726882788.10968: entering _queue_task() for managed_node3/set_fact 22286 1726882788.11208: worker is 1 (out of 1 available) 22286 1726882788.11220: exiting _queue_task() for managed_node3/set_fact 22286 1726882788.11232: done queuing things up, now waiting for results queue to drain 22286 1726882788.11337: waiting for pending results... 
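The two `when:` expressions the log has been evaluating (`__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` and `not __network_is_ostree is defined`) can be mimicked in plain Python to see why both tasks are skipped. This is an illustrative stand-in, not Ansible source; every variable value below is made up:

```python
# Plain-Python stand-ins for the two Jinja conditionals evaluated above.
# All variable values are illustrative, not taken from this run.

# 1) __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
required_facts = {"distribution", "os_family"}        # hypothetical __network_required_facts
ansible_facts = {"distribution": "Fedora", "os_family": "RedHat"}
missing = required_facts.difference(ansible_facts)    # Jinja's difference filter ~ set difference
print(len(missing) > 0)   # False -> "when evaluation is False, skipping this task"

# 2) not __network_is_ostree is defined
# set_fact populated __network_is_ostree earlier in the play, so "is defined" is True.
namespace = {"__network_is_ostree": False}            # hypothetical host variable namespace
print("__network_is_ostree" not in namespace)         # False -> both ostree tasks are skipped
```

The `false_condition` field in each skipped-task result echoes exactly which of these expressions came back `False`.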
22286 1726882788.11509: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22286 1726882788.11690: in run() - task 0affe814-3a2d-a75d-4836-0000000001ff 22286 1726882788.11711: variable 'ansible_search_path' from source: unknown 22286 1726882788.11720: variable 'ansible_search_path' from source: unknown 22286 1726882788.11764: calling self._execute() 22286 1726882788.11860: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882788.11874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882788.11898: variable 'omit' from source: magic vars 22286 1726882788.12316: variable 'ansible_distribution_major_version' from source: facts 22286 1726882788.12344: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882788.12553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882788.12943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882788.13039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882788.13047: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882788.13097: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882788.13197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882788.13237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882788.13273: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882788.13339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882788.13428: variable '__network_is_ostree' from source: set_fact 22286 1726882788.13443: Evaluated conditional (not __network_is_ostree is defined): False 22286 1726882788.13520: when evaluation is False, skipping this task 22286 1726882788.13523: _execute() done 22286 1726882788.13531: dumping result to json 22286 1726882788.13533: done dumping result, returning 22286 1726882788.13538: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-a75d-4836-0000000001ff] 22286 1726882788.13540: sending task result for task 0affe814-3a2d-a75d-4836-0000000001ff 22286 1726882788.13604: done sending task result for task 0affe814-3a2d-a75d-4836-0000000001ff 22286 1726882788.13607: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22286 1726882788.13673: no more pending results, returning what we have 22286 1726882788.13677: results queue empty 22286 1726882788.13678: checking for any_errors_fatal 22286 1726882788.13684: done checking for any_errors_fatal 22286 1726882788.13686: checking for max_fail_percentage 22286 1726882788.13688: done checking for max_fail_percentage 22286 1726882788.13689: checking to see if all hosts have failed and the running result is not ok 22286 1726882788.13690: done checking to see if all hosts have failed 22286 1726882788.13691: getting the remaining hosts for this loop 22286 1726882788.13693: done getting the remaining hosts for this loop 
22286 1726882788.13698: getting the next task for host managed_node3 22286 1726882788.13708: done getting next task for host managed_node3 22286 1726882788.13712: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22286 1726882788.13717: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882788.13733: getting variables 22286 1726882788.13740: in VariableManager get_vars() 22286 1726882788.13784: Calling all_inventory to load vars for managed_node3 22286 1726882788.13787: Calling groups_inventory to load vars for managed_node3 22286 1726882788.13790: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882788.13801: Calling all_plugins_play to load vars for managed_node3 22286 1726882788.13805: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882788.13809: Calling groups_plugins_play to load vars for managed_node3 22286 1726882788.14301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882788.14649: done with get_vars() 22286 1726882788.14661: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:48 -0400 (0:00:00.037) 0:00:11.541 ****** 22286 1726882788.14767: entering _queue_task() for managed_node3/service_facts 22286 1726882788.14770: Creating lock for service_facts 22286 1726882788.15013: worker is 1 (out of 1 available) 22286 1726882788.15025: exiting _queue_task() for managed_node3/service_facts 22286 1726882788.15152: done queuing things up, now waiting for results queue to drain 22286 1726882788.15155: waiting for pending results... 
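Before the `service_facts` module can run, the worker creates a private remote temp directory with the one-liner logged below (`( umask 77 && mkdir -p ... && mkdir ansible-tmp-<ts>-<pid>-<rand> )`). The owner-only permissions come from the `umask 77`; a minimal Python sketch of the same behavior (all paths and numbers here are invented, only the permission logic matches):

```python
# Illustrative reproduction of Ansible's remote temp-dir step:
#   /bin/sh -c '( umask 77 && mkdir -p "base" && mkdir "base/ansible-tmp-<ts>-<pid>-<rand>" )'
# Paths are made up; the point is that umask 77 yields a 0700 directory.
import os
import random
import stat
import tempfile
import time

base = tempfile.mkdtemp(prefix="demo-ansible-")            # stands in for ~/.ansible/tmp
name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"

old = os.umask(0o77)                                       # umask 77 -> strip group/other bits
try:
    os.mkdir(os.path.join(base, name))                     # plain mkdir: fails if it already exists
finally:
    os.umask(old)                                          # restore the process umask

mode = stat.S_IMODE(os.stat(os.path.join(base, name)).st_mode)
print(oct(mode))  # 0o700: readable/writable/searchable only by the owner
```

The non-`-p` inner `mkdir` is deliberate: it guarantees the directory is fresh, so a pre-existing (possibly attacker-created) path makes the command fail instead of being silently reused.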
22286 1726882788.15353: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 22286 1726882788.15589: in run() - task 0affe814-3a2d-a75d-4836-000000000201 22286 1726882788.15594: variable 'ansible_search_path' from source: unknown 22286 1726882788.15598: variable 'ansible_search_path' from source: unknown 22286 1726882788.15601: calling self._execute() 22286 1726882788.15658: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882788.15672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882788.15696: variable 'omit' from source: magic vars 22286 1726882788.16121: variable 'ansible_distribution_major_version' from source: facts 22286 1726882788.16146: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882788.16163: variable 'omit' from source: magic vars 22286 1726882788.16269: variable 'omit' from source: magic vars 22286 1726882788.16319: variable 'omit' from source: magic vars 22286 1726882788.16375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882788.16426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882788.16463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882788.16640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882788.16645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882788.16647: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882788.16650: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882788.16653: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 22286 1726882788.16714: Set connection var ansible_shell_executable to /bin/sh 22286 1726882788.16730: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882788.16741: Set connection var ansible_connection to ssh 22286 1726882788.16750: Set connection var ansible_shell_type to sh 22286 1726882788.16765: Set connection var ansible_timeout to 10 22286 1726882788.16787: Set connection var ansible_pipelining to False 22286 1726882788.16822: variable 'ansible_shell_executable' from source: unknown 22286 1726882788.16832: variable 'ansible_connection' from source: unknown 22286 1726882788.16844: variable 'ansible_module_compression' from source: unknown 22286 1726882788.16883: variable 'ansible_shell_type' from source: unknown 22286 1726882788.16887: variable 'ansible_shell_executable' from source: unknown 22286 1726882788.16890: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882788.16892: variable 'ansible_pipelining' from source: unknown 22286 1726882788.16894: variable 'ansible_timeout' from source: unknown 22286 1726882788.16897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882788.17137: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882788.17211: variable 'omit' from source: magic vars 22286 1726882788.17215: starting attempt loop 22286 1726882788.17217: running the handler 22286 1726882788.17220: _low_level_execute_command(): starting 22286 1726882788.17222: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882788.17997: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882788.18051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882788.18124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882788.18149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882788.18161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882788.18315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882788.20456: stdout chunk (state=3): >>>/root <<< 22286 1726882788.20460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882788.20539: stdout chunk (state=3): >>><<< 22286 1726882788.20543: stderr chunk (state=3): >>><<< 22286 1726882788.20547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882788.20550: _low_level_execute_command(): starting 22286 1726882788.20553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600 `" && echo ansible-tmp-1726882788.204984-22726-18118315221600="` echo /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600 `" ) && sleep 0' 22286 1726882788.21168: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882788.21177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882788.21192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882788.21208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882788.21225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882788.21337: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882788.21353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882788.21505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882788.23588: stdout chunk (state=3): >>>ansible-tmp-1726882788.204984-22726-18118315221600=/root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600 <<< 22286 1726882788.23793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882788.23799: stdout chunk (state=3): >>><<< 22286 1726882788.23803: stderr chunk (state=3): >>><<< 22286 1726882788.23939: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882788.204984-22726-18118315221600=/root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882788.23942: variable 'ansible_module_compression' from source: unknown 22286 1726882788.23944: ANSIBALLZ: Using lock for service_facts 22286 1726882788.23946: ANSIBALLZ: Acquiring lock 22286 1726882788.23948: ANSIBALLZ: Lock acquired: 140212085218896 22286 1726882788.23950: ANSIBALLZ: Creating module 22286 1726882788.45151: ANSIBALLZ: Writing module into payload 22286 1726882788.45295: ANSIBALLZ: Writing module 22286 1726882788.45324: ANSIBALLZ: Renaming module 22286 1726882788.45347: ANSIBALLZ: Done creating module 22286 1726882788.45370: variable 'ansible_facts' from source: unknown 22286 1726882788.45467: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py 22286 1726882788.45743: Sending initial data 22286 1726882788.45746: Sent initial data (160 bytes) 22286 1726882788.46313: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882788.46395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882788.46399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882788.46443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882788.46591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882788.48415: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22286 1726882788.48450: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882788.48559: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882788.48695: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp672e89t8 /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py <<< 22286 1726882788.48700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py" <<< 22286 1726882788.48801: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp672e89t8" to remote "/root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py" <<< 22286 1726882788.50253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882788.50482: stderr chunk (state=3): >>><<< 22286 1726882788.50487: stdout chunk (state=3): >>><<< 22286 1726882788.50489: done transferring module to remote 22286 1726882788.50492: _low_level_execute_command(): starting 22286 1726882788.50494: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/ /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py && sleep 0' 22286 1726882788.51150: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882788.51154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882788.51208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882788.51225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882788.51249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882788.51408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882788.53404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882788.53413: stdout chunk (state=3): >>><<< 22286 1726882788.53415: stderr chunk (state=3): >>><<< 22286 1726882788.53432: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882788.53438: _low_level_execute_command(): starting 22286 1726882788.53640: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/AnsiballZ_service_facts.py && sleep 0' 22286 1726882788.54070: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882788.54088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882788.54101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882788.54116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882788.54129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882788.54141: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882788.54152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882788.54167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882788.54178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882788.54184: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882788.54253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882788.54293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882788.54312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882788.54328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882788.54480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882790.50048: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": 
{"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22286 1726882790.51646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882790.51665: stderr chunk (state=3): >>><<< 22286 1726882790.51669: stdout chunk (state=3): >>><<< 22286 1726882790.51694: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": 
{"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": 
{"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": 
"systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": 
"grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882790.53216: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882790.53235: _low_level_execute_command(): starting 22286 1726882790.53247: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882788.204984-22726-18118315221600/ > /dev/null 2>&1 && sleep 0' 22286 1726882790.54751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882790.54860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882790.54867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882790.55041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882790.57329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882790.57333: stdout chunk (state=3): >>><<< 22286 1726882790.57344: stderr chunk (state=3): >>><<< 22286 1726882790.57360: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882790.57368: handler run complete 22286 1726882790.57859: variable 'ansible_facts' from source: unknown 22286 1726882790.58341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882790.59873: variable 'ansible_facts' from source: unknown 22286 1726882790.63009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882790.63772: attempt loop complete, returning result 22286 1726882790.63793: _execute() done 22286 1726882790.63796: dumping result to json 22286 1726882790.63869: done dumping result, returning 22286 1726882790.63901: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-a75d-4836-000000000201] 22286 1726882790.63904: sending task result for task 0affe814-3a2d-a75d-4836-000000000201 22286 1726882790.65868: done sending task result for task 0affe814-3a2d-a75d-4836-000000000201 22286 1726882790.65988: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882790.66054: no more pending results, returning what we have 22286 1726882790.66057: results queue empty 22286 1726882790.66059: checking for any_errors_fatal 22286 1726882790.66063: done checking for any_errors_fatal 22286 1726882790.66064: checking for max_fail_percentage 22286 1726882790.66066: done checking for max_fail_percentage 22286 1726882790.66067: checking to see if all hosts have failed and the running 
result is not ok 22286 1726882790.66068: done checking to see if all hosts have failed 22286 1726882790.66069: getting the remaining hosts for this loop 22286 1726882790.66071: done getting the remaining hosts for this loop 22286 1726882790.66078: getting the next task for host managed_node3 22286 1726882790.66086: done getting next task for host managed_node3 22286 1726882790.66095: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22286 1726882790.66101: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882790.66113: getting variables 22286 1726882790.66115: in VariableManager get_vars() 22286 1726882790.66155: Calling all_inventory to load vars for managed_node3 22286 1726882790.66159: Calling groups_inventory to load vars for managed_node3 22286 1726882790.66161: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882790.66171: Calling all_plugins_play to load vars for managed_node3 22286 1726882790.66173: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882790.66179: Calling groups_plugins_play to load vars for managed_node3 22286 1726882790.67546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882790.69265: done with get_vars() 22286 1726882790.69286: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:50 -0400 (0:00:02.548) 0:00:14.089 ****** 22286 1726882790.69606: entering _queue_task() for managed_node3/package_facts 22286 1726882790.69608: Creating lock for package_facts 22286 1726882790.70324: worker is 1 (out of 1 available) 22286 1726882790.70397: exiting _queue_task() for managed_node3/package_facts 22286 1726882790.70413: done queuing things up, now waiting for results queue to drain 22286 1726882790.70415: waiting for pending results... 
22286 1726882790.70852: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 22286 1726882790.70871: in run() - task 0affe814-3a2d-a75d-4836-000000000202 22286 1726882790.70894: variable 'ansible_search_path' from source: unknown 22286 1726882790.70902: variable 'ansible_search_path' from source: unknown 22286 1726882790.70948: calling self._execute() 22286 1726882790.71055: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882790.71068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882790.71090: variable 'omit' from source: magic vars 22286 1726882790.71569: variable 'ansible_distribution_major_version' from source: facts 22286 1726882790.71591: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882790.71603: variable 'omit' from source: magic vars 22286 1726882790.71718: variable 'omit' from source: magic vars 22286 1726882790.71781: variable 'omit' from source: magic vars 22286 1726882790.71832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882790.71887: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882790.71916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882790.71945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882790.71967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882790.72009: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882790.72077: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882790.72081: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 22286 1726882790.72166: Set connection var ansible_shell_executable to /bin/sh 22286 1726882790.72188: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882790.72197: Set connection var ansible_connection to ssh 22286 1726882790.72208: Set connection var ansible_shell_type to sh 22286 1726882790.72221: Set connection var ansible_timeout to 10 22286 1726882790.72238: Set connection var ansible_pipelining to False 22286 1726882790.72271: variable 'ansible_shell_executable' from source: unknown 22286 1726882790.72281: variable 'ansible_connection' from source: unknown 22286 1726882790.72295: variable 'ansible_module_compression' from source: unknown 22286 1726882790.72304: variable 'ansible_shell_type' from source: unknown 22286 1726882790.72316: variable 'ansible_shell_executable' from source: unknown 22286 1726882790.72405: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882790.72409: variable 'ansible_pipelining' from source: unknown 22286 1726882790.72411: variable 'ansible_timeout' from source: unknown 22286 1726882790.72415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882790.72602: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882790.72639: variable 'omit' from source: magic vars 22286 1726882790.72642: starting attempt loop 22286 1726882790.72732: running the handler 22286 1726882790.72737: _low_level_execute_command(): starting 22286 1726882790.72740: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882790.73510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882790.73607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882790.73658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882790.73772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882790.75619: stdout chunk (state=3): >>>/root <<< 22286 1726882790.75733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882790.75827: stderr chunk (state=3): >>><<< 22286 1726882790.75854: stdout chunk (state=3): >>><<< 22286 1726882790.75972: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882790.75976: _low_level_execute_command(): starting 22286 1726882790.75979: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916 `" && echo ansible-tmp-1726882790.758772-22799-63338935186916="` echo /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916 `" ) && sleep 0' 22286 1726882790.76516: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882790.76533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882790.76655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882790.76683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882790.76698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882790.76848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882790.78928: stdout chunk (state=3): >>>ansible-tmp-1726882790.758772-22799-63338935186916=/root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916 <<< 22286 1726882790.79118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882790.79146: stdout chunk (state=3): >>><<< 22286 1726882790.79150: stderr chunk (state=3): >>><<< 22286 1726882790.79169: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882790.758772-22799-63338935186916=/root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882790.79340: variable 'ansible_module_compression' from source: unknown 22286 1726882790.79343: ANSIBALLZ: Using lock for package_facts 22286 1726882790.79345: ANSIBALLZ: Acquiring lock 22286 1726882790.79347: ANSIBALLZ: Lock acquired: 140212085344736 22286 1726882790.79349: ANSIBALLZ: Creating module 22286 1726882791.30843: ANSIBALLZ: Writing module into payload 22286 1726882791.31305: ANSIBALLZ: Writing module 22286 1726882791.31311: ANSIBALLZ: Renaming module 22286 1726882791.31347: ANSIBALLZ: Done creating module 22286 1726882791.31390: variable 'ansible_facts' from source: unknown 22286 1726882791.32059: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py 22286 1726882791.32277: Sending initial data 22286 1726882791.32289: Sent initial data (160 bytes) 22286 1726882791.33695: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882791.33850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882791.33922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882791.33942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882791.34022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882791.34175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882791.36072: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 22286 1726882791.36145: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882791.36178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882791.36314: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp469kuk4v /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py <<< 22286 1726882791.36327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py" <<< 22286 1726882791.36468: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp469kuk4v" to remote "/root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py" <<< 22286 1726882791.39982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882791.40119: stderr chunk (state=3): >>><<< 22286 1726882791.40122: stdout chunk (state=3): >>><<< 22286 1726882791.40175: done transferring module to remote 22286 1726882791.40188: _low_level_execute_command(): starting 22286 1726882791.40194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/ /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py && sleep 0' 22286 1726882791.41161: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882791.41250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882791.41284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882791.41312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882791.41315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882791.41474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882791.43674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882791.43678: stdout chunk (state=3): >>><<< 22286 1726882791.43680: stderr chunk (state=3): >>><<< 22286 1726882791.43783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882791.43786: _low_level_execute_command(): starting 22286 1726882791.43789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/AnsiballZ_package_facts.py && sleep 0' 22286 1726882791.44898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882791.44910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882791.45002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882791.45043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882791.45065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882791.45077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882791.45239: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 22286 1726882792.10402: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", 
"version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", 
"version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 22286 1726882792.10473: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], 
"vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": 
[{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": 
"0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", 
"version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 22286 1726882792.10696: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", 
"version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", 
"version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": 
[{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "<<< 22286 1726882792.10703: stdout chunk (state=3): >>>version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null<<< 22286 1726882792.10707: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": 
[{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "sou<<< 22286 1726882792.10712: stdout chunk (state=3): >>>rce": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", 
"release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "a<<< 22286 1726882792.10714: stdout chunk (state=3): 
>>>spell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": 
"boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": 
"1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": 
[{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 22286 1726882792.10719: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": 
[{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": 
"python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 22286 1726882792.10721: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22286 1726882792.13122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882792.13125: stdout chunk (state=3): >>><<< 22286 1726882792.13128: stderr chunk (state=3): >>><<< 22286 1726882792.13349: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882792.21436: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882792.21767: _low_level_execute_command(): starting 22286 1726882792.21770: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882790.758772-22799-63338935186916/ > /dev/null 2>&1 && sleep 0' 22286 1726882792.23013: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882792.23023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882792.23035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882792.23053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882792.23344: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882792.23361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882792.23594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882792.25666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882792.25747: stderr chunk (state=3): >>><<< 22286 1726882792.25814: stdout chunk (state=3): >>><<< 22286 1726882792.25817: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882792.25820: handler run complete 22286 1726882792.28566: variable 'ansible_facts' from source: unknown 22286 1726882792.29310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.32857: variable 'ansible_facts' from source: unknown 22286 1726882792.37715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.39048: attempt loop complete, returning result 22286 1726882792.39071: _execute() done 22286 1726882792.39075: dumping result to json 22286 1726882792.39405: done dumping result, returning 22286 1726882792.39440: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-a75d-4836-000000000202] 22286 1726882792.39444: sending task result for task 0affe814-3a2d-a75d-4836-000000000202 22286 1726882792.43710: done sending task result for task 0affe814-3a2d-a75d-4836-000000000202 22286 1726882792.43713: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882792.43817: no more pending results, returning what we have 22286 1726882792.43820: results queue empty 22286 1726882792.43821: checking for any_errors_fatal 22286 1726882792.43826: done checking for any_errors_fatal 22286 1726882792.43827: checking for max_fail_percentage 22286 1726882792.43829: done checking for max_fail_percentage 22286 1726882792.43830: checking to 
see if all hosts have failed and the running result is not ok 22286 1726882792.43831: done checking to see if all hosts have failed 22286 1726882792.43832: getting the remaining hosts for this loop 22286 1726882792.43833: done getting the remaining hosts for this loop 22286 1726882792.43839: getting the next task for host managed_node3 22286 1726882792.43847: done getting next task for host managed_node3 22286 1726882792.43850: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22286 1726882792.43853: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882792.43864: getting variables 22286 1726882792.43866: in VariableManager get_vars() 22286 1726882792.43901: Calling all_inventory to load vars for managed_node3 22286 1726882792.43905: Calling groups_inventory to load vars for managed_node3 22286 1726882792.43908: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882792.43918: Calling all_plugins_play to load vars for managed_node3 22286 1726882792.43922: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882792.43926: Calling groups_plugins_play to load vars for managed_node3 22286 1726882792.47831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.52794: done with get_vars() 22286 1726882792.52829: done getting variables 22286 1726882792.52902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:52 -0400 (0:00:01.833) 0:00:15.922 ****** 22286 1726882792.52944: entering _queue_task() for managed_node3/debug 22286 1726882792.53282: worker is 1 (out of 1 available) 22286 1726882792.53298: exiting _queue_task() for managed_node3/debug 22286 1726882792.53313: done queuing things up, now waiting for results queue to drain 22286 1726882792.53315: waiting for pending results... 
22286 1726882792.53737: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 22286 1726882792.53764: in run() - task 0affe814-3a2d-a75d-4836-000000000018 22286 1726882792.53791: variable 'ansible_search_path' from source: unknown 22286 1726882792.53800: variable 'ansible_search_path' from source: unknown 22286 1726882792.53852: calling self._execute() 22286 1726882792.53961: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882792.54053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.54162: variable 'omit' from source: magic vars 22286 1726882792.54932: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.55078: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882792.55096: variable 'omit' from source: magic vars 22286 1726882792.55240: variable 'omit' from source: magic vars 22286 1726882792.55563: variable 'network_provider' from source: set_fact 22286 1726882792.55615: variable 'omit' from source: magic vars 22286 1726882792.55668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882792.55715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882792.55756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882792.55767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882792.55784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882792.55823: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882792.55829: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 
1726882792.55839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.55973: Set connection var ansible_shell_executable to /bin/sh 22286 1726882792.55987: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882792.55991: Set connection var ansible_connection to ssh 22286 1726882792.55993: Set connection var ansible_shell_type to sh 22286 1726882792.56002: Set connection var ansible_timeout to 10 22286 1726882792.56013: Set connection var ansible_pipelining to False 22286 1726882792.56042: variable 'ansible_shell_executable' from source: unknown 22286 1726882792.56046: variable 'ansible_connection' from source: unknown 22286 1726882792.56049: variable 'ansible_module_compression' from source: unknown 22286 1726882792.56051: variable 'ansible_shell_type' from source: unknown 22286 1726882792.56059: variable 'ansible_shell_executable' from source: unknown 22286 1726882792.56062: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882792.56072: variable 'ansible_pipelining' from source: unknown 22286 1726882792.56076: variable 'ansible_timeout' from source: unknown 22286 1726882792.56081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.56262: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882792.56278: variable 'omit' from source: magic vars 22286 1726882792.56281: starting attempt loop 22286 1726882792.56290: running the handler 22286 1726882792.56345: handler run complete 22286 1726882792.56363: attempt loop complete, returning result 22286 1726882792.56366: _execute() done 22286 1726882792.56369: dumping result to json 22286 1726882792.56374: done dumping result, returning 
22286 1726882792.56387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-a75d-4836-000000000018] 22286 1726882792.56399: sending task result for task 0affe814-3a2d-a75d-4836-000000000018 ok: [managed_node3] => {} MSG: Using network provider: nm 22286 1726882792.56564: no more pending results, returning what we have 22286 1726882792.56567: results queue empty 22286 1726882792.56568: checking for any_errors_fatal 22286 1726882792.56585: done checking for any_errors_fatal 22286 1726882792.56586: checking for max_fail_percentage 22286 1726882792.56588: done checking for max_fail_percentage 22286 1726882792.56589: checking to see if all hosts have failed and the running result is not ok 22286 1726882792.56590: done checking to see if all hosts have failed 22286 1726882792.56591: getting the remaining hosts for this loop 22286 1726882792.56593: done getting the remaining hosts for this loop 22286 1726882792.56597: getting the next task for host managed_node3 22286 1726882792.56604: done getting next task for host managed_node3 22286 1726882792.56609: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22286 1726882792.56612: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882792.56624: getting variables 22286 1726882792.56626: in VariableManager get_vars() 22286 1726882792.56669: Calling all_inventory to load vars for managed_node3 22286 1726882792.56673: Calling groups_inventory to load vars for managed_node3 22286 1726882792.56675: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882792.56685: Calling all_plugins_play to load vars for managed_node3 22286 1726882792.56688: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882792.56691: Calling groups_plugins_play to load vars for managed_node3 22286 1726882792.57492: done sending task result for task 0affe814-3a2d-a75d-4836-000000000018 22286 1726882792.57496: WORKER PROCESS EXITING 22286 1726882792.61851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.66747: done with get_vars() 22286 1726882792.66801: done getting variables 22286 1726882792.66872: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:52 -0400 (0:00:00.139) 0:00:16.062 ****** 22286 1726882792.66923: entering _queue_task() for managed_node3/fail 22286 1726882792.67367: worker is 1 (out of 1 available) 22286 1726882792.67383: exiting _queue_task() for managed_node3/fail 22286 1726882792.67396: done queuing things up, now waiting for results queue to drain 22286 1726882792.67398: waiting for pending results... 
22286 1726882792.67623: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22286 1726882792.67846: in run() - task 0affe814-3a2d-a75d-4836-000000000019 22286 1726882792.67850: variable 'ansible_search_path' from source: unknown 22286 1726882792.67853: variable 'ansible_search_path' from source: unknown 22286 1726882792.67942: calling self._execute() 22286 1726882792.68058: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882792.68101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.68141: variable 'omit' from source: magic vars 22286 1726882792.68828: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.68856: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882792.69354: variable 'network_state' from source: role '' defaults 22286 1726882792.69358: Evaluated conditional (network_state != {}): False 22286 1726882792.69361: when evaluation is False, skipping this task 22286 1726882792.69364: _execute() done 22286 1726882792.69366: dumping result to json 22286 1726882792.69370: done dumping result, returning 22286 1726882792.69373: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-a75d-4836-000000000019] 22286 1726882792.69379: sending task result for task 0affe814-3a2d-a75d-4836-000000000019 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882792.69730: no more pending results, returning what we have 22286 1726882792.69737: results queue empty 22286 1726882792.69738: checking for any_errors_fatal 22286 1726882792.69745: done 
checking for any_errors_fatal 22286 1726882792.69746: checking for max_fail_percentage 22286 1726882792.69748: done checking for max_fail_percentage 22286 1726882792.69749: checking to see if all hosts have failed and the running result is not ok 22286 1726882792.69751: done checking to see if all hosts have failed 22286 1726882792.69752: getting the remaining hosts for this loop 22286 1726882792.69754: done getting the remaining hosts for this loop 22286 1726882792.69759: getting the next task for host managed_node3 22286 1726882792.69768: done getting next task for host managed_node3 22286 1726882792.69773: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22286 1726882792.69781: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882792.69802: getting variables 22286 1726882792.69804: in VariableManager get_vars() 22286 1726882792.70063: Calling all_inventory to load vars for managed_node3 22286 1726882792.70067: Calling groups_inventory to load vars for managed_node3 22286 1726882792.70070: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882792.70087: Calling all_plugins_play to load vars for managed_node3 22286 1726882792.70091: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882792.70096: Calling groups_plugins_play to load vars for managed_node3 22286 1726882792.70835: done sending task result for task 0affe814-3a2d-a75d-4836-000000000019 22286 1726882792.70839: WORKER PROCESS EXITING 22286 1726882792.73222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.76326: done with get_vars() 22286 1726882792.76367: done getting variables 22286 1726882792.76439: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:52 -0400 (0:00:00.095) 0:00:16.158 ****** 22286 1726882792.76485: entering _queue_task() for managed_node3/fail 22286 1726882792.76815: worker is 1 (out of 1 available) 22286 1726882792.76829: exiting _queue_task() for managed_node3/fail 22286 1726882792.76846: done queuing things up, now waiting for results queue to drain 22286 1726882792.76848: waiting for pending results... 
22286 1726882792.77098: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22286 1726882792.77261: in run() - task 0affe814-3a2d-a75d-4836-00000000001a 22286 1726882792.77283: variable 'ansible_search_path' from source: unknown 22286 1726882792.77294: variable 'ansible_search_path' from source: unknown 22286 1726882792.77339: calling self._execute() 22286 1726882792.77443: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882792.77463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.77481: variable 'omit' from source: magic vars 22286 1726882792.77926: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.77949: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882792.78111: variable 'network_state' from source: role '' defaults 22286 1726882792.78130: Evaluated conditional (network_state != {}): False 22286 1726882792.78141: when evaluation is False, skipping this task 22286 1726882792.78149: _execute() done 22286 1726882792.78156: dumping result to json 22286 1726882792.78164: done dumping result, returning 22286 1726882792.78223: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-a75d-4836-00000000001a] 22286 1726882792.78227: sending task result for task 0affe814-3a2d-a75d-4836-00000000001a 22286 1726882792.78306: done sending task result for task 0affe814-3a2d-a75d-4836-00000000001a skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882792.78390: no more pending results, returning what we have 22286 1726882792.78395: results queue empty 22286 
1726882792.78396: checking for any_errors_fatal 22286 1726882792.78408: done checking for any_errors_fatal 22286 1726882792.78409: checking for max_fail_percentage 22286 1726882792.78411: done checking for max_fail_percentage 22286 1726882792.78412: checking to see if all hosts have failed and the running result is not ok 22286 1726882792.78413: done checking to see if all hosts have failed 22286 1726882792.78414: getting the remaining hosts for this loop 22286 1726882792.78416: done getting the remaining hosts for this loop 22286 1726882792.78421: getting the next task for host managed_node3 22286 1726882792.78430: done getting next task for host managed_node3 22286 1726882792.78437: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22286 1726882792.78441: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882792.78462: getting variables 22286 1726882792.78464: in VariableManager get_vars() 22286 1726882792.78513: Calling all_inventory to load vars for managed_node3 22286 1726882792.78517: Calling groups_inventory to load vars for managed_node3 22286 1726882792.78520: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882792.78533: Calling all_plugins_play to load vars for managed_node3 22286 1726882792.78746: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882792.78752: WORKER PROCESS EXITING 22286 1726882792.78757: Calling groups_plugins_play to load vars for managed_node3 22286 1726882792.80961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.84058: done with get_vars() 22286 1726882792.84092: done getting variables 22286 1726882792.84167: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:52 -0400 (0:00:00.077) 0:00:16.235 ****** 22286 1726882792.84205: entering _queue_task() for managed_node3/fail 22286 1726882792.84568: worker is 1 (out of 1 available) 22286 1726882792.84582: exiting _queue_task() for managed_node3/fail 22286 1726882792.84645: done queuing things up, now waiting for results queue to drain 22286 1726882792.84647: waiting for pending results... 
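Both abort tasks above skip because the role-default `network_state` is an empty dict, so the guard never fires. Judging from the `false_condition` fields in the skip results, the task pattern presumably looks like the following hypothetical reconstruction (the real bodies live in `roles/network/tasks/main.yml` of the collection; the `msg` wording is assumed):

```yaml
# Hedged sketch of the guard pattern implied by the log output above.
# The log shows: Evaluated conditional (network_state != {}): False
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # wording assumed
  when: network_state != {}
```

Because `when` evaluates to False, the action plugin is never invoked and the executor returns the `skipping:` result seen in the log.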
22286 1726882792.84856: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22286 1726882792.85032: in run() - task 0affe814-3a2d-a75d-4836-00000000001b 22286 1726882792.85061: variable 'ansible_search_path' from source: unknown 22286 1726882792.85071: variable 'ansible_search_path' from source: unknown 22286 1726882792.85118: calling self._execute() 22286 1726882792.85229: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882792.85250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.85275: variable 'omit' from source: magic vars 22286 1726882792.85739: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.85759: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882792.86004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882792.87814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882792.87875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882792.87906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882792.87938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882792.87961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882792.88050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882792.88070: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882792.88097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882792.88253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882792.88257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882792.88274: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.88299: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22286 1726882792.88439: variable 'ansible_distribution' from source: facts 22286 1726882792.88443: variable '__network_rh_distros' from source: role '' defaults 22286 1726882792.88446: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22286 1726882792.88449: when evaluation is False, skipping this task 22286 1726882792.88452: _execute() done 22286 1726882792.88454: dumping result to json 22286 1726882792.88457: done dumping result, returning 22286 1726882792.88658: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-a75d-4836-00000000001b] 22286 1726882792.88661: sending task result for task 0affe814-3a2d-a75d-4836-00000000001b 22286 1726882792.88728: done sending task result for task 0affe814-3a2d-a75d-4836-00000000001b 22286 1726882792.88732: WORKER PROCESS EXITING 
skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22286 1726882792.88804: no more pending results, returning what we have 22286 1726882792.88807: results queue empty 22286 1726882792.88809: checking for any_errors_fatal 22286 1726882792.88814: done checking for any_errors_fatal 22286 1726882792.88815: checking for max_fail_percentage 22286 1726882792.88817: done checking for max_fail_percentage 22286 1726882792.88818: checking to see if all hosts have failed and the running result is not ok 22286 1726882792.88819: done checking to see if all hosts have failed 22286 1726882792.88820: getting the remaining hosts for this loop 22286 1726882792.88821: done getting the remaining hosts for this loop 22286 1726882792.88826: getting the next task for host managed_node3 22286 1726882792.88832: done getting next task for host managed_node3 22286 1726882792.88841: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22286 1726882792.88844: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882792.88863: getting variables 22286 1726882792.88865: in VariableManager get_vars() 22286 1726882792.88912: Calling all_inventory to load vars for managed_node3 22286 1726882792.88915: Calling groups_inventory to load vars for managed_node3 22286 1726882792.88918: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882792.88928: Calling all_plugins_play to load vars for managed_node3 22286 1726882792.88932: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882792.88938: Calling groups_plugins_play to load vars for managed_node3 22286 1726882792.90350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882792.92467: done with get_vars() 22286 1726882792.92549: done getting variables 22286 1726882792.92671: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:52 -0400 (0:00:00.084) 0:00:16.320 ****** 22286 1726882792.92696: entering _queue_task() for managed_node3/dnf 22286 1726882792.92927: worker is 1 (out of 1 available) 22286 1726882792.92942: exiting _queue_task() for managed_node3/dnf 22286 1726882792.92955: done queuing things up, now waiting for results queue to drain 22286 1726882792.92957: waiting for pending results... 
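The EL10 teaming abort evaluates two conditionals in sequence: the version check (`ansible_distribution_major_version | int > 9`) passes, but the distribution check (`ansible_distribution in __network_rh_distros`) fails, so the task skips. A hypothetical sketch of that two-clause guard, with an assumed `msg` (an AND-list of `when` conditions, matching the order the log evaluates them):

```yaml
# Hedged reconstruction; clause order mirrors the log's evaluation order.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this release  # wording assumed
  when:
    - ansible_distribution_major_version | int > 9   # log: True
    - ansible_distribution in __network_rh_distros   # log: False -> skip
```

Note that a list under `when` is ANDed, so the first False clause short-circuits the task into a skip without calling the `fail` action.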
22286 1726882792.93145: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22286 1726882792.93233: in run() - task 0affe814-3a2d-a75d-4836-00000000001c 22286 1726882792.93249: variable 'ansible_search_path' from source: unknown 22286 1726882792.93253: variable 'ansible_search_path' from source: unknown 22286 1726882792.93290: calling self._execute() 22286 1726882792.93368: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882792.93374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882792.93389: variable 'omit' from source: magic vars 22286 1726882792.93704: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.93716: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882792.93889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882792.96259: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882792.96746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882792.96750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882792.96767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882792.96800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882792.96894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882792.96929: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882792.96963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882792.97015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882792.97031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882792.97162: variable 'ansible_distribution' from source: facts 22286 1726882792.97166: variable 'ansible_distribution_major_version' from source: facts 22286 1726882792.97181: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22286 1726882792.97319: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882792.97489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882792.97519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882792.97541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882792.97573: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882792.97588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882792.97627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882792.97657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882792.97677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882792.97710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882792.97724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882792.97762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882792.97785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 
1726882792.97805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882792.97838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882792.97855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882792.97984: variable 'network_connections' from source: task vars 22286 1726882792.97995: variable 'interface' from source: play vars 22286 1726882792.98054: variable 'interface' from source: play vars 22286 1726882792.98118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882792.98252: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882792.98288: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882792.98316: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882792.98342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882792.98378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882792.98404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882792.98428: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882792.98451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882792.98507: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882792.98716: variable 'network_connections' from source: task vars 22286 1726882792.98720: variable 'interface' from source: play vars 22286 1726882792.98774: variable 'interface' from source: play vars 22286 1726882792.98804: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22286 1726882792.98809: when evaluation is False, skipping this task 22286 1726882792.98813: _execute() done 22286 1726882792.98815: dumping result to json 22286 1726882792.98817: done dumping result, returning 22286 1726882792.98827: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-00000000001c] 22286 1726882792.98834: sending task result for task 0affe814-3a2d-a75d-4836-00000000001c 22286 1726882792.98925: done sending task result for task 0affe814-3a2d-a75d-4836-00000000001c 22286 1726882792.98932: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22286 1726882792.98988: no more pending results, returning what we have 22286 1726882792.98991: results queue empty 22286 1726882792.98992: checking for any_errors_fatal 22286 1726882792.98998: done checking for any_errors_fatal 22286 1726882792.98999: 
checking for max_fail_percentage 22286 1726882792.99001: done checking for max_fail_percentage 22286 1726882792.99002: checking to see if all hosts have failed and the running result is not ok 22286 1726882792.99003: done checking to see if all hosts have failed 22286 1726882792.99004: getting the remaining hosts for this loop 22286 1726882792.99006: done getting the remaining hosts for this loop 22286 1726882792.99010: getting the next task for host managed_node3 22286 1726882792.99017: done getting next task for host managed_node3 22286 1726882792.99021: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22286 1726882792.99025: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882792.99049: getting variables 22286 1726882792.99051: in VariableManager get_vars() 22286 1726882792.99091: Calling all_inventory to load vars for managed_node3 22286 1726882792.99095: Calling groups_inventory to load vars for managed_node3 22286 1726882792.99097: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882792.99107: Calling all_plugins_play to load vars for managed_node3 22286 1726882792.99110: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882792.99113: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.00970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.02535: done with get_vars() 22286 1726882793.02556: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22286 1726882793.02617: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:53 -0400 (0:00:00.099) 0:00:16.419 ****** 22286 1726882793.02646: entering _queue_task() for managed_node3/yum 22286 1726882793.02647: Creating lock for yum 22286 1726882793.02864: worker is 1 (out of 1 available) 22286 1726882793.02879: exiting _queue_task() for managed_node3/yum 22286 1726882793.02893: done queuing things up, now waiting for results queue to drain 22286 1726882793.02894: waiting for pending results... 
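The DNF package-check task above is gated on whether any wireless or team connections are defined; with neither present, the composite conditional is False and the task skips. A hedged sketch of that shape (the module arguments, `network_packages` variable name, and check-mode dry run are assumptions for illustration, not the role's verbatim source):

```yaml
# Hedged sketch; only the `when` expression is taken directly from the log:
# Evaluated conditional (__network_wireless_connections_defined or
#                        __network_team_connections_defined): False
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # package list variable assumed
    state: latest
  check_mode: true                  # "check if updates are available" suggests a dry run; assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

The YUM variant that follows in the log is additionally guarded by `ansible_distribution_major_version | int < 8`, which is why it skips on this host, and `ansible.builtin.yum` is redirected to `ansible.builtin.dnf` on modern ansible-core, as the `redirecting (type: action)` line shows.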
22286 1726882793.03081: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22286 1726882793.03186: in run() - task 0affe814-3a2d-a75d-4836-00000000001d 22286 1726882793.03199: variable 'ansible_search_path' from source: unknown 22286 1726882793.03202: variable 'ansible_search_path' from source: unknown 22286 1726882793.03238: calling self._execute() 22286 1726882793.03314: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.03320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.03330: variable 'omit' from source: magic vars 22286 1726882793.03637: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.03649: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.03800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882793.05502: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882793.05564: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882793.05595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882793.05626: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882793.05653: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882793.05720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.05749: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.05770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.05804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.05816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.05897: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.05909: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22286 1726882793.05912: when evaluation is False, skipping this task 22286 1726882793.05915: _execute() done 22286 1726882793.05920: dumping result to json 22286 1726882793.05925: done dumping result, returning 22286 1726882793.05932: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-00000000001d] 22286 1726882793.05939: sending task result for task 0affe814-3a2d-a75d-4836-00000000001d 22286 1726882793.06037: done sending task result for task 0affe814-3a2d-a75d-4836-00000000001d 22286 1726882793.06040: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22286 1726882793.06109: no more pending results, returning 
what we have 22286 1726882793.06112: results queue empty 22286 1726882793.06114: checking for any_errors_fatal 22286 1726882793.06119: done checking for any_errors_fatal 22286 1726882793.06120: checking for max_fail_percentage 22286 1726882793.06122: done checking for max_fail_percentage 22286 1726882793.06123: checking to see if all hosts have failed and the running result is not ok 22286 1726882793.06124: done checking to see if all hosts have failed 22286 1726882793.06125: getting the remaining hosts for this loop 22286 1726882793.06126: done getting the remaining hosts for this loop 22286 1726882793.06130: getting the next task for host managed_node3 22286 1726882793.06138: done getting next task for host managed_node3 22286 1726882793.06142: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22286 1726882793.06145: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882793.06159: getting variables 22286 1726882793.06161: in VariableManager get_vars() 22286 1726882793.06202: Calling all_inventory to load vars for managed_node3 22286 1726882793.06206: Calling groups_inventory to load vars for managed_node3 22286 1726882793.06209: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882793.06218: Calling all_plugins_play to load vars for managed_node3 22286 1726882793.06221: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882793.06230: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.07421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.11441: done with get_vars() 22286 1726882793.11465: done getting variables 22286 1726882793.11504: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:53 -0400 (0:00:00.088) 0:00:16.508 ****** 22286 1726882793.11524: entering _queue_task() for managed_node3/fail 22286 1726882793.11759: worker is 1 (out of 1 available) 22286 1726882793.11772: exiting _queue_task() for managed_node3/fail 22286 1726882793.11788: done queuing things up, now waiting for results queue to drain 22286 1726882793.11790: waiting for pending results... 
22286 1726882793.11972: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22286 1726882793.12080: in run() - task 0affe814-3a2d-a75d-4836-00000000001e 22286 1726882793.12097: variable 'ansible_search_path' from source: unknown 22286 1726882793.12101: variable 'ansible_search_path' from source: unknown 22286 1726882793.12138: calling self._execute() 22286 1726882793.12212: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.12218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.12231: variable 'omit' from source: magic vars 22286 1726882793.12553: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.12565: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.12667: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882793.12840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882793.14554: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882793.14616: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882793.14653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882793.14686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882793.14708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882793.14778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 22286 1726882793.14805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.14826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.14865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.14887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.14925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.14947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.14968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.15006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.15018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.15054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.15074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.15102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.15131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.15146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.15290: variable 'network_connections' from source: task vars 22286 1726882793.15299: variable 'interface' from source: play vars 22286 1726882793.15361: variable 'interface' from source: play vars 22286 1726882793.15420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882793.15563: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882793.15595: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882793.15621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882793.15654: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882793.15689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882793.15708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882793.15729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.15756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882793.15806: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882793.16009: variable 'network_connections' from source: task vars 22286 1726882793.16015: variable 'interface' from source: play vars 22286 1726882793.16069: variable 'interface' from source: play vars 22286 1726882793.16099: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22286 1726882793.16103: when evaluation is False, skipping this task 22286 1726882793.16106: _execute() done 22286 1726882793.16108: dumping result to json 22286 1726882793.16111: done dumping result, returning 22286 1726882793.16119: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-00000000001e] 22286 1726882793.16125: sending task result for task 0affe814-3a2d-a75d-4836-00000000001e 22286 1726882793.16217: done sending task result for task 
0affe814-3a2d-a75d-4836-00000000001e 22286 1726882793.16221: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22286 1726882793.16284: no more pending results, returning what we have 22286 1726882793.16287: results queue empty 22286 1726882793.16288: checking for any_errors_fatal 22286 1726882793.16295: done checking for any_errors_fatal 22286 1726882793.16296: checking for max_fail_percentage 22286 1726882793.16298: done checking for max_fail_percentage 22286 1726882793.16299: checking to see if all hosts have failed and the running result is not ok 22286 1726882793.16300: done checking to see if all hosts have failed 22286 1726882793.16301: getting the remaining hosts for this loop 22286 1726882793.16303: done getting the remaining hosts for this loop 22286 1726882793.16307: getting the next task for host managed_node3 22286 1726882793.16314: done getting next task for host managed_node3 22286 1726882793.16318: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22286 1726882793.16321: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882793.16346: getting variables 22286 1726882793.16348: in VariableManager get_vars() 22286 1726882793.16388: Calling all_inventory to load vars for managed_node3 22286 1726882793.16391: Calling groups_inventory to load vars for managed_node3 22286 1726882793.16394: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882793.16403: Calling all_plugins_play to load vars for managed_node3 22286 1726882793.16406: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882793.16409: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.17622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.19212: done with get_vars() 22286 1726882793.19232: done getting variables 22286 1726882793.19282: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:53 -0400 (0:00:00.077) 0:00:16.586 ****** 22286 1726882793.19308: entering _queue_task() for managed_node3/package 22286 1726882793.19521: worker is 1 (out of 1 available) 22286 1726882793.19536: exiting _queue_task() for managed_node3/package 22286 1726882793.19550: done queuing things up, now waiting for results queue to drain 22286 1726882793.19552: waiting for pending results... 
22286 1726882793.19737: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 22286 1726882793.19843: in run() - task 0affe814-3a2d-a75d-4836-00000000001f 22286 1726882793.19855: variable 'ansible_search_path' from source: unknown 22286 1726882793.19858: variable 'ansible_search_path' from source: unknown 22286 1726882793.19896: calling self._execute() 22286 1726882793.19971: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.19979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.19993: variable 'omit' from source: magic vars 22286 1726882793.20313: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.20329: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.20493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882793.20720: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882793.20761: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882793.20818: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882793.20853: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882793.20944: variable 'network_packages' from source: role '' defaults 22286 1726882793.21037: variable '__network_provider_setup' from source: role '' defaults 22286 1726882793.21047: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882793.21108: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882793.21117: variable '__network_packages_default_nm' from source: role '' defaults 22286 1726882793.21171: variable 
'__network_packages_default_nm' from source: role '' defaults 22286 1726882793.21337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882793.23147: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882793.23200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882793.23228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882793.23257: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882793.23285: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882793.23348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.23373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.23400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.23432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.23446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 
1726882793.23491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.23511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.23531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.23564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.23576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.23761: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22286 1726882793.23865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.23888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.23908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.23944: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.23957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.24031: variable 'ansible_python' from source: facts 22286 1726882793.24055: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22286 1726882793.24121: variable '__network_wpa_supplicant_required' from source: role '' defaults 22286 1726882793.24198: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22286 1726882793.24308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.24327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.24349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.24387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.24399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.24439: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.24463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.24490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.24520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.24532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.24654: variable 'network_connections' from source: task vars 22286 1726882793.24659: variable 'interface' from source: play vars 22286 1726882793.24747: variable 'interface' from source: play vars 22286 1726882793.24809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882793.24833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882793.24860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.24888: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882793.24928: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882793.25163: variable 'network_connections' from source: task vars 22286 1726882793.25168: variable 'interface' from source: play vars 22286 1726882793.25254: variable 'interface' from source: play vars 22286 1726882793.25300: variable '__network_packages_default_wireless' from source: role '' defaults 22286 1726882793.25368: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882793.25619: variable 'network_connections' from source: task vars 22286 1726882793.25623: variable 'interface' from source: play vars 22286 1726882793.25681: variable 'interface' from source: play vars 22286 1726882793.25706: variable '__network_packages_default_team' from source: role '' defaults 22286 1726882793.25771: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882793.26028: variable 'network_connections' from source: task vars 22286 1726882793.26033: variable 'interface' from source: play vars 22286 1726882793.26090: variable 'interface' from source: play vars 22286 1726882793.26146: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882793.26198: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882793.26204: variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882793.26259: variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882793.26444: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22286 1726882793.26833: variable 'network_connections' from source: task vars 22286 1726882793.26838: variable 'interface' from source: play vars 
22286 1726882793.26894: variable 'interface' from source: play vars 22286 1726882793.26904: variable 'ansible_distribution' from source: facts 22286 1726882793.26909: variable '__network_rh_distros' from source: role '' defaults 22286 1726882793.26915: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.26936: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22286 1726882793.27075: variable 'ansible_distribution' from source: facts 22286 1726882793.27081: variable '__network_rh_distros' from source: role '' defaults 22286 1726882793.27088: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.27097: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22286 1726882793.27236: variable 'ansible_distribution' from source: facts 22286 1726882793.27240: variable '__network_rh_distros' from source: role '' defaults 22286 1726882793.27247: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.27276: variable 'network_provider' from source: set_fact 22286 1726882793.27292: variable 'ansible_facts' from source: unknown 22286 1726882793.27873: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22286 1726882793.27876: when evaluation is False, skipping this task 22286 1726882793.27882: _execute() done 22286 1726882793.27885: dumping result to json 22286 1726882793.27890: done dumping result, returning 22286 1726882793.27897: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-a75d-4836-00000000001f] 22286 1726882793.27902: sending task result for task 0affe814-3a2d-a75d-4836-00000000001f 22286 1726882793.27995: done sending task result for task 0affe814-3a2d-a75d-4836-00000000001f 22286 1726882793.27997: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not 
network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22286 1726882793.28055: no more pending results, returning what we have 22286 1726882793.28058: results queue empty 22286 1726882793.28059: checking for any_errors_fatal 22286 1726882793.28066: done checking for any_errors_fatal 22286 1726882793.28067: checking for max_fail_percentage 22286 1726882793.28069: done checking for max_fail_percentage 22286 1726882793.28070: checking to see if all hosts have failed and the running result is not ok 22286 1726882793.28071: done checking to see if all hosts have failed 22286 1726882793.28072: getting the remaining hosts for this loop 22286 1726882793.28075: done getting the remaining hosts for this loop 22286 1726882793.28079: getting the next task for host managed_node3 22286 1726882793.28088: done getting next task for host managed_node3 22286 1726882793.28092: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22286 1726882793.28095: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882793.28111: getting variables 22286 1726882793.28113: in VariableManager get_vars() 22286 1726882793.28164: Calling all_inventory to load vars for managed_node3 22286 1726882793.28167: Calling groups_inventory to load vars for managed_node3 22286 1726882793.28170: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882793.28179: Calling all_plugins_play to load vars for managed_node3 22286 1726882793.28183: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882793.28187: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.29561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.31205: done with get_vars() 22286 1726882793.31228: done getting variables 22286 1726882793.31279: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:53 -0400 (0:00:00.119) 0:00:16.706 ****** 22286 1726882793.31307: entering _queue_task() for managed_node3/package 22286 1726882793.31552: worker is 1 (out of 1 available) 22286 1726882793.31566: exiting _queue_task() for managed_node3/package 22286 1726882793.31581: done queuing things up, now waiting for results queue to drain 22286 1726882793.31582: waiting for pending results... 
22286 1726882793.31766: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22286 1726882793.31871: in run() - task 0affe814-3a2d-a75d-4836-000000000020 22286 1726882793.31886: variable 'ansible_search_path' from source: unknown 22286 1726882793.31890: variable 'ansible_search_path' from source: unknown 22286 1726882793.31927: calling self._execute() 22286 1726882793.32008: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.32015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.32029: variable 'omit' from source: magic vars 22286 1726882793.32359: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.32372: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.32479: variable 'network_state' from source: role '' defaults 22286 1726882793.32490: Evaluated conditional (network_state != {}): False 22286 1726882793.32493: when evaluation is False, skipping this task 22286 1726882793.32496: _execute() done 22286 1726882793.32501: dumping result to json 22286 1726882793.32506: done dumping result, returning 22286 1726882793.32513: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-a75d-4836-000000000020] 22286 1726882793.32520: sending task result for task 0affe814-3a2d-a75d-4836-000000000020 22286 1726882793.32621: done sending task result for task 0affe814-3a2d-a75d-4836-000000000020 22286 1726882793.32624: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882793.32678: no more pending results, returning what we have 22286 1726882793.32682: results queue empty 22286 1726882793.32683: checking 
for any_errors_fatal 22286 1726882793.32690: done checking for any_errors_fatal 22286 1726882793.32691: checking for max_fail_percentage 22286 1726882793.32693: done checking for max_fail_percentage 22286 1726882793.32694: checking to see if all hosts have failed and the running result is not ok 22286 1726882793.32695: done checking to see if all hosts have failed 22286 1726882793.32696: getting the remaining hosts for this loop 22286 1726882793.32698: done getting the remaining hosts for this loop 22286 1726882793.32702: getting the next task for host managed_node3 22286 1726882793.32709: done getting next task for host managed_node3 22286 1726882793.32713: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22286 1726882793.32716: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882793.32732: getting variables 22286 1726882793.32736: in VariableManager get_vars() 22286 1726882793.32774: Calling all_inventory to load vars for managed_node3 22286 1726882793.32776: Calling groups_inventory to load vars for managed_node3 22286 1726882793.32779: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882793.32789: Calling all_plugins_play to load vars for managed_node3 22286 1726882793.32792: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882793.32795: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.34600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.36379: done with get_vars() 22286 1726882793.36411: done getting variables 22286 1726882793.36489: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:53 -0400 (0:00:00.052) 0:00:16.758 ****** 22286 1726882793.36522: entering _queue_task() for managed_node3/package 22286 1726882793.36842: worker is 1 (out of 1 available) 22286 1726882793.36853: exiting _queue_task() for managed_node3/package 22286 1726882793.36866: done queuing things up, now waiting for results queue to drain 22286 1726882793.36867: waiting for pending results... 
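The repeated skip results above all follow the same pattern: the task's `when:` conditional `network_state != {}` is templated against the role default `network_state: {}`, evaluates to False, and the executor returns a skip result instead of dispatching the package module. A minimal sketch of that truth-test, using plain Python `eval` as a stand-in for Ansible's real Jinja2 templating (the helper names are illustrative, not Ansible's actual code path; the variable values come from the log):

```python
# Minimal sketch of Ansible-style `when:` evaluation. Real Ansible templates
# the bare conditional through Jinja2; eval() stands in for that here.
def evaluate_when(conditional: str, task_vars: dict) -> bool:
    # The expression is evaluated against the task's variables and
    # truth-tested; False means the task is skipped.
    return bool(eval(conditional, {"__builtins__": {}}, task_vars))

def run_or_skip(conditional: str, task_vars: dict) -> dict:
    if not evaluate_when(conditional, task_vars):
        # Shape of the "skipping:" result emitted in the log above.
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# network_state defaults to an empty dict in this run (source: role '' defaults),
# so both package-install tasks are skipped:
print(run_or_skip("network_state != {}", {"network_state": {}}))
```

With a non-empty `network_state`, the same conditional evaluates True and the package tasks would run instead of skipping.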
22286 1726882793.37407: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22286 1726882793.37412: in run() - task 0affe814-3a2d-a75d-4836-000000000021 22286 1726882793.37416: variable 'ansible_search_path' from source: unknown 22286 1726882793.37419: variable 'ansible_search_path' from source: unknown 22286 1726882793.37422: calling self._execute() 22286 1726882793.37503: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.37507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.37511: variable 'omit' from source: magic vars 22286 1726882793.37939: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.37949: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.38117: variable 'network_state' from source: role '' defaults 22286 1726882793.38130: Evaluated conditional (network_state != {}): False 22286 1726882793.38135: when evaluation is False, skipping this task 22286 1726882793.38139: _execute() done 22286 1726882793.38142: dumping result to json 22286 1726882793.38160: done dumping result, returning 22286 1726882793.38164: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-a75d-4836-000000000021] 22286 1726882793.38167: sending task result for task 0affe814-3a2d-a75d-4836-000000000021 22286 1726882793.38310: done sending task result for task 0affe814-3a2d-a75d-4836-000000000021 22286 1726882793.38312: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882793.38385: no more pending results, returning what we have 22286 1726882793.38390: results queue empty 22286 1726882793.38391: checking for 
any_errors_fatal 22286 1726882793.38402: done checking for any_errors_fatal 22286 1726882793.38403: checking for max_fail_percentage 22286 1726882793.38406: done checking for max_fail_percentage 22286 1726882793.38407: checking to see if all hosts have failed and the running result is not ok 22286 1726882793.38409: done checking to see if all hosts have failed 22286 1726882793.38410: getting the remaining hosts for this loop 22286 1726882793.38412: done getting the remaining hosts for this loop 22286 1726882793.38418: getting the next task for host managed_node3 22286 1726882793.38425: done getting next task for host managed_node3 22286 1726882793.38430: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22286 1726882793.38436: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882793.38456: getting variables 22286 1726882793.38458: in VariableManager get_vars() 22286 1726882793.38508: Calling all_inventory to load vars for managed_node3 22286 1726882793.38511: Calling groups_inventory to load vars for managed_node3 22286 1726882793.38514: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882793.38527: Calling all_plugins_play to load vars for managed_node3 22286 1726882793.38531: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882793.38640: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.40838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.43793: done with get_vars() 22286 1726882793.43831: done getting variables 22286 1726882793.43957: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:53 -0400 (0:00:00.074) 0:00:16.833 ****** 22286 1726882793.43997: entering _queue_task() for managed_node3/service 22286 1726882793.43999: Creating lock for service 22286 1726882793.44342: worker is 1 (out of 1 available) 22286 1726882793.44358: exiting _queue_task() for managed_node3/service 22286 1726882793.44374: done queuing things up, now waiting for results queue to drain 22286 1726882793.44378: waiting for pending results... 
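The restart task queued next is governed by a compound conditional: NetworkManager is restarted only if the defined `network_connections` include a wireless or team interface, which is what `__network_wireless_connections_defined or __network_team_connections_defined` expresses. A hedged sketch of that check (the function and the ethernet-only connection list are illustrative assumptions, not the role's actual source):

```python
# Illustrative check mirroring the conditional
# "__network_wireless_connections_defined or __network_team_connections_defined".
def needs_nm_restart(network_connections: list) -> bool:
    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)
    return wireless_defined or team_defined

# An ethernet-only connection list (hypothetical values; the real interface
# name comes from play vars) makes the conditional False, so the restart
# task is skipped, as seen in the log.
print(needs_nm_restart([{"name": "ethtest0", "type": "ethernet"}]))
```

Either a `wireless` or a `team` entry in `network_connections` is enough to flip the result to True and trigger the restart.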
22286 1726882793.44807: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22286 1726882793.44833: in run() - task 0affe814-3a2d-a75d-4836-000000000022 22286 1726882793.44851: variable 'ansible_search_path' from source: unknown 22286 1726882793.44855: variable 'ansible_search_path' from source: unknown 22286 1726882793.44919: calling self._execute() 22286 1726882793.45009: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.45027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.45031: variable 'omit' from source: magic vars 22286 1726882793.45616: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.45620: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.45655: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882793.45922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882793.48580: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882793.49046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882793.49095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882793.49136: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882793.49165: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882793.49259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 22286 1726882793.49298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.49336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.49388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.49405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.49469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.49500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.49536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.49588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.49605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.49661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.49693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.49723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.49852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.49855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.50025: variable 'network_connections' from source: task vars 22286 1726882793.50041: variable 'interface' from source: play vars 22286 1726882793.50144: variable 'interface' from source: play vars 22286 1726882793.50241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882793.50456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882793.50507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882793.50545: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882793.50578: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882793.50653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882793.50720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882793.50724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.50750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882793.50819: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882793.51169: variable 'network_connections' from source: task vars 22286 1726882793.51175: variable 'interface' from source: play vars 22286 1726882793.51252: variable 'interface' from source: play vars 22286 1726882793.51339: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22286 1726882793.51342: when evaluation is False, skipping this task 22286 1726882793.51345: _execute() done 22286 1726882793.51347: dumping result to json 22286 1726882793.51350: done dumping result, returning 22286 1726882793.51352: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-000000000022] 22286 1726882793.51354: sending task result for task 0affe814-3a2d-a75d-4836-000000000022 22286 1726882793.51433: done sending task result for task 
0affe814-3a2d-a75d-4836-000000000022 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22286 1726882793.51599: no more pending results, returning what we have 22286 1726882793.51603: results queue empty 22286 1726882793.51604: checking for any_errors_fatal 22286 1726882793.51612: done checking for any_errors_fatal 22286 1726882793.51613: checking for max_fail_percentage 22286 1726882793.51616: done checking for max_fail_percentage 22286 1726882793.51617: checking to see if all hosts have failed and the running result is not ok 22286 1726882793.51618: done checking to see if all hosts have failed 22286 1726882793.51619: getting the remaining hosts for this loop 22286 1726882793.51621: done getting the remaining hosts for this loop 22286 1726882793.51626: getting the next task for host managed_node3 22286 1726882793.51633: done getting next task for host managed_node3 22286 1726882793.51640: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22286 1726882793.51644: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882793.51666: getting variables 22286 1726882793.51668: in VariableManager get_vars() 22286 1726882793.51719: Calling all_inventory to load vars for managed_node3 22286 1726882793.51723: Calling groups_inventory to load vars for managed_node3 22286 1726882793.51726: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882793.51740: Calling all_plugins_play to load vars for managed_node3 22286 1726882793.51745: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882793.51750: Calling groups_plugins_play to load vars for managed_node3 22286 1726882793.52268: WORKER PROCESS EXITING 22286 1726882793.54857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882793.57883: done with get_vars() 22286 1726882793.57918: done getting variables 22286 1726882793.57990: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:53 -0400 (0:00:00.140) 0:00:16.973 ****** 22286 1726882793.58027: entering _queue_task() for managed_node3/service 22286 1726882793.58365: worker is 1 (out of 1 available) 22286 1726882793.58379: exiting _queue_task() for managed_node3/service 22286 1726882793.58392: done queuing things up, now waiting for results queue to drain 22286 1726882793.58394: waiting for pending results... 
22286 1726882793.58764: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22286 1726882793.58849: in run() - task 0affe814-3a2d-a75d-4836-000000000023 22286 1726882793.58878: variable 'ansible_search_path' from source: unknown 22286 1726882793.58892: variable 'ansible_search_path' from source: unknown 22286 1726882793.58938: calling self._execute() 22286 1726882793.59051: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.59065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.59103: variable 'omit' from source: magic vars 22286 1726882793.59547: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.59623: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882793.59791: variable 'network_provider' from source: set_fact 22286 1726882793.59804: variable 'network_state' from source: role '' defaults 22286 1726882793.59821: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22286 1726882793.59839: variable 'omit' from source: magic vars 22286 1726882793.59920: variable 'omit' from source: magic vars 22286 1726882793.59970: variable 'network_service_name' from source: role '' defaults 22286 1726882793.60065: variable 'network_service_name' from source: role '' defaults 22286 1726882793.60198: variable '__network_provider_setup' from source: role '' defaults 22286 1726882793.60238: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882793.60298: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882793.60314: variable '__network_packages_default_nm' from source: role '' defaults 22286 1726882793.60407: variable '__network_packages_default_nm' from source: role '' defaults 22286 1726882793.60762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 22286 1726882793.63753: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882793.63942: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882793.63946: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882793.63960: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882793.63996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882793.64105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.64153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.64203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.64273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.64340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.64367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 22286 1726882793.64415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.64456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.64520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.64545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.65044: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22286 1726882793.65048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.65253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.65300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.65360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.65540: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.65680: variable 'ansible_python' from source: facts 22286 1726882793.65922: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22286 1726882793.66058: variable '__network_wpa_supplicant_required' from source: role '' defaults 22286 1726882793.66285: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22286 1726882793.66560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.66658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.66767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.66925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.66959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.67171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882793.67192: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882793.67230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.67443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882793.67446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882793.67860: variable 'network_connections' from source: task vars 22286 1726882793.67990: variable 'interface' from source: play vars 22286 1726882793.68106: variable 'interface' from source: play vars 22286 1726882793.68398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882793.68812: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882793.68873: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882793.68926: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882793.68976: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882793.69051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882793.69088: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882793.69132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882793.69169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882793.69221: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882793.69604: variable 'network_connections' from source: task vars 22286 1726882793.69620: variable 'interface' from source: play vars 22286 1726882793.69730: variable 'interface' from source: play vars 22286 1726882793.69761: variable '__network_packages_default_wireless' from source: role '' defaults 22286 1726882793.69864: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882793.70252: variable 'network_connections' from source: task vars 22286 1726882793.70277: variable 'interface' from source: play vars 22286 1726882793.70345: variable 'interface' from source: play vars 22286 1726882793.70386: variable '__network_packages_default_team' from source: role '' defaults 22286 1726882793.70477: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882793.70931: variable 'network_connections' from source: task vars 22286 1726882793.70936: variable 'interface' from source: play vars 22286 1726882793.70963: variable 'interface' from source: play vars 22286 1726882793.71039: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882793.71117: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882793.71125: 
variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882793.71202: variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882793.71510: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22286 1726882793.72161: variable 'network_connections' from source: task vars 22286 1726882793.72166: variable 'interface' from source: play vars 22286 1726882793.72243: variable 'interface' from source: play vars 22286 1726882793.72389: variable 'ansible_distribution' from source: facts 22286 1726882793.72392: variable '__network_rh_distros' from source: role '' defaults 22286 1726882793.72395: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.72397: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22286 1726882793.72528: variable 'ansible_distribution' from source: facts 22286 1726882793.72532: variable '__network_rh_distros' from source: role '' defaults 22286 1726882793.72541: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.72550: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22286 1726882793.72775: variable 'ansible_distribution' from source: facts 22286 1726882793.72804: variable '__network_rh_distros' from source: role '' defaults 22286 1726882793.72808: variable 'ansible_distribution_major_version' from source: facts 22286 1726882793.72831: variable 'network_provider' from source: set_fact 22286 1726882793.72860: variable 'omit' from source: magic vars 22286 1726882793.72895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882793.72927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882793.72962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 
1726882793.72971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882793.72987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882793.73028: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882793.73031: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.73033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.73179: Set connection var ansible_shell_executable to /bin/sh 22286 1726882793.73183: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882793.73185: Set connection var ansible_connection to ssh 22286 1726882793.73188: Set connection var ansible_shell_type to sh 22286 1726882793.73247: Set connection var ansible_timeout to 10 22286 1726882793.73250: Set connection var ansible_pipelining to False 22286 1726882793.73253: variable 'ansible_shell_executable' from source: unknown 22286 1726882793.73256: variable 'ansible_connection' from source: unknown 22286 1726882793.73258: variable 'ansible_module_compression' from source: unknown 22286 1726882793.73260: variable 'ansible_shell_type' from source: unknown 22286 1726882793.73263: variable 'ansible_shell_executable' from source: unknown 22286 1726882793.73265: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882793.73267: variable 'ansible_pipelining' from source: unknown 22286 1726882793.73269: variable 'ansible_timeout' from source: unknown 22286 1726882793.73274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882793.73398: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882793.73462: variable 'omit' from source: magic vars 22286 1726882793.73470: starting attempt loop 22286 1726882793.73473: running the handler 22286 1726882793.73512: variable 'ansible_facts' from source: unknown 22286 1726882793.74772: _low_level_execute_command(): starting 22286 1726882793.74776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882793.75452: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882793.75508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882793.75521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882793.75566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882793.75686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882793.77549: stdout chunk (state=3): >>>/root <<< 22286 
1726882793.77749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882793.77753: stdout chunk (state=3): >>><<< 22286 1726882793.77755: stderr chunk (state=3): >>><<< 22286 1726882793.77872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882793.77875: _low_level_execute_command(): starting 22286 1726882793.77879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218 `" && echo ansible-tmp-1726882793.7777784-22886-178408534144218="` echo /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218 `" ) && sleep 0' 22286 1726882793.78424: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882793.78443: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882793.78460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882793.78490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882793.78510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882793.78524: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882793.78548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882793.78592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882793.78667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882793.78700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882793.78717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882793.78864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882793.80995: stdout chunk (state=3): >>>ansible-tmp-1726882793.7777784-22886-178408534144218=/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218 <<< 22286 1726882793.81108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882793.81164: stderr chunk (state=3): >>><<< 22286 1726882793.81172: stdout chunk 
(state=3): >>><<< 22286 1726882793.81190: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882793.7777784-22886-178408534144218=/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882793.81227: variable 'ansible_module_compression' from source: unknown 22286 1726882793.81277: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 22286 1726882793.81281: ANSIBALLZ: Acquiring lock 22286 1726882793.81285: ANSIBALLZ: Lock acquired: 140212085117232 22286 1726882793.81290: ANSIBALLZ: Creating module 22286 1726882794.07150: ANSIBALLZ: Writing module into payload 22286 1726882794.07295: ANSIBALLZ: Writing module 22286 1726882794.07325: ANSIBALLZ: Renaming module 22286 1726882794.07329: ANSIBALLZ: Done creating module 22286 1726882794.07350: variable 'ansible_facts' from source: unknown 22286 
1726882794.07474: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py 22286 1726882794.07600: Sending initial data 22286 1726882794.07604: Sent initial data (156 bytes) 22286 1726882794.08100: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882794.08107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882794.08110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882794.08113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882794.08178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882794.08185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882794.08186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882794.08311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882794.10168: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882794.10172: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882794.10282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882794.10402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpdl8tkqqi /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py <<< 22286 1726882794.10406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py" <<< 22286 1726882794.10514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpdl8tkqqi" to remote "/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py" <<< 22286 1726882794.12600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882794.12664: stderr chunk (state=3): >>><<< 22286 1726882794.12667: stdout chunk (state=3): >>><<< 22286 1726882794.12689: done transferring module to remote 22286 1726882794.12699: _low_level_execute_command(): starting 22286 1726882794.12704: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/ /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py && sleep 0' 22286 1726882794.13399: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882794.13420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882794.13439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882794.13463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882794.13608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882794.15645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882794.15694: stderr chunk (state=3): >>><<< 22286 1726882794.15697: stdout chunk (state=3): >>><<< 22286 1726882794.15711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882794.15715: _low_level_execute_command(): starting 22286 1726882794.15720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/AnsiballZ_systemd.py && sleep 0' 22286 1726882794.16340: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882794.16344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882794.16347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882794.16350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882794.16352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882794.16445: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882794.16448: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882794.16450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882794.16453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882794.16455: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882794.16457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882794.16459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882794.16473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882794.16475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882794.16478: stderr chunk (state=3): >>>debug2: match found <<< 22286 1726882794.16480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882794.16582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882794.16585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882794.16740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882794.16893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882794.50301: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": 
"0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "1942622000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 22286 1726882794.50336: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": 
"infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", 
"KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 22286 1726882794.50345: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", 
"DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22286 1726882794.52374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882794.52425: stderr chunk (state=3): >>><<< 22286 1726882794.52428: stdout chunk (state=3): >>><<< 22286 1726882794.52446: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "1942622000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", 
"CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882794.52617: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882794.52624: _low_level_execute_command(): starting 22286 1726882794.52631: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882793.7777784-22886-178408534144218/ > /dev/null 2>&1 && sleep 0' 22286 1726882794.53083: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882794.53087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882794.53089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882794.53092: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882794.53094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882794.53142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882794.53150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882794.53266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882794.55296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882794.55343: stderr chunk (state=3): >>><<< 22286 1726882794.55347: stdout chunk (state=3): >>><<< 22286 1726882794.55361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 22286 1726882794.55369: handler run complete 22286 1726882794.55419: attempt loop complete, returning result 22286 1726882794.55423: _execute() done 22286 1726882794.55425: dumping result to json 22286 1726882794.55444: done dumping result, returning 22286 1726882794.55453: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-a75d-4836-000000000023] 22286 1726882794.55460: sending task result for task 0affe814-3a2d-a75d-4836-000000000023 22286 1726882794.55728: done sending task result for task 0affe814-3a2d-a75d-4836-000000000023 22286 1726882794.55730: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882794.55796: no more pending results, returning what we have 22286 1726882794.55799: results queue empty 22286 1726882794.55800: checking for any_errors_fatal 22286 1726882794.55806: done checking for any_errors_fatal 22286 1726882794.55807: checking for max_fail_percentage 22286 1726882794.55809: done checking for max_fail_percentage 22286 1726882794.55810: checking to see if all hosts have failed and the running result is not ok 22286 1726882794.55811: done checking to see if all hosts have failed 22286 1726882794.55812: getting the remaining hosts for this loop 22286 1726882794.55814: done getting the remaining hosts for this loop 22286 1726882794.55819: getting the next task for host managed_node3 22286 1726882794.55825: done getting next task for host managed_node3 22286 1726882794.55830: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22286 1726882794.55837: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882794.55856: getting variables 22286 1726882794.55858: in VariableManager get_vars() 22286 1726882794.55900: Calling all_inventory to load vars for managed_node3 22286 1726882794.55902: Calling groups_inventory to load vars for managed_node3 22286 1726882794.55905: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882794.55915: Calling all_plugins_play to load vars for managed_node3 22286 1726882794.55918: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882794.55922: Calling groups_plugins_play to load vars for managed_node3 22286 1726882794.57281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882794.58855: done with get_vars() 22286 1726882794.58881: done getting variables 22286 1726882794.58932: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:54 -0400 (0:00:01.009) 0:00:17.983 ****** 22286 1726882794.58960: entering _queue_task() for managed_node3/service 22286 1726882794.59208: worker is 1 (out of 1 available) 22286 1726882794.59221: exiting _queue_task() for managed_node3/service 
22286 1726882794.59237: done queuing things up, now waiting for results queue to drain 22286 1726882794.59239: waiting for pending results... 22286 1726882794.59425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22286 1726882794.59522: in run() - task 0affe814-3a2d-a75d-4836-000000000024 22286 1726882794.59537: variable 'ansible_search_path' from source: unknown 22286 1726882794.59541: variable 'ansible_search_path' from source: unknown 22286 1726882794.59581: calling self._execute() 22286 1726882794.59659: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882794.59665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882794.59680: variable 'omit' from source: magic vars 22286 1726882794.59996: variable 'ansible_distribution_major_version' from source: facts 22286 1726882794.60007: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882794.60110: variable 'network_provider' from source: set_fact 22286 1726882794.60114: Evaluated conditional (network_provider == "nm"): True 22286 1726882794.60197: variable '__network_wpa_supplicant_required' from source: role '' defaults 22286 1726882794.60274: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22286 1726882794.60419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882794.62046: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882794.62102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882794.62132: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882794.62163: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882794.62188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882794.62265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882794.62290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882794.62315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882794.62352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882794.62365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882794.62408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882794.62437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882794.62457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 22286 1726882794.62492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882794.62504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882794.62545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882794.62565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882794.62588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882794.62618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882794.62631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882794.62757: variable 'network_connections' from source: task vars 22286 1726882794.62760: variable 'interface' from source: play vars 22286 1726882794.62820: variable 'interface' from source: play vars 22286 1726882794.62889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882794.63019: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882794.63052: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882794.63084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882794.63109: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882794.63146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882794.63164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882794.63193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882794.63212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882794.63254: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882794.63459: variable 'network_connections' from source: task vars 22286 1726882794.63463: variable 'interface' from source: play vars 22286 1726882794.63516: variable 'interface' from source: play vars 22286 1726882794.63556: Evaluated conditional (__network_wpa_supplicant_required): False 22286 1726882794.63560: when evaluation is False, skipping this task 22286 1726882794.63563: _execute() done 22286 1726882794.63568: dumping result to json 22286 1726882794.63572: done dumping result, returning 22286 1726882794.63580: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-a75d-4836-000000000024] 22286 1726882794.63590: sending task result for task 0affe814-3a2d-a75d-4836-000000000024 22286 1726882794.63672: done sending task result for task 0affe814-3a2d-a75d-4836-000000000024 22286 1726882794.63678: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22286 1726882794.63733: no more pending results, returning what we have 22286 1726882794.63739: results queue empty 22286 1726882794.63740: checking for any_errors_fatal 22286 1726882794.63770: done checking for any_errors_fatal 22286 1726882794.63771: checking for max_fail_percentage 22286 1726882794.63773: done checking for max_fail_percentage 22286 1726882794.63774: checking to see if all hosts have failed and the running result is not ok 22286 1726882794.63777: done checking to see if all hosts have failed 22286 1726882794.63778: getting the remaining hosts for this loop 22286 1726882794.63780: done getting the remaining hosts for this loop 22286 1726882794.63784: getting the next task for host managed_node3 22286 1726882794.63791: done getting next task for host managed_node3 22286 1726882794.63797: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22286 1726882794.63800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 22286 1726882794.63816: getting variables 22286 1726882794.63818: in VariableManager get_vars() 22286 1726882794.63857: Calling all_inventory to load vars for managed_node3 22286 1726882794.63860: Calling groups_inventory to load vars for managed_node3 22286 1726882794.63862: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882794.63871: Calling all_plugins_play to load vars for managed_node3 22286 1726882794.63873: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882794.63879: Calling groups_plugins_play to load vars for managed_node3 22286 1726882794.65099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882794.66669: done with get_vars() 22286 1726882794.66694: done getting variables 22286 1726882794.66745: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:54 -0400 (0:00:00.078) 0:00:18.061 ****** 22286 1726882794.66770: entering _queue_task() for managed_node3/service 22286 1726882794.67006: worker is 1 (out of 1 available) 22286 1726882794.67020: exiting _queue_task() for managed_node3/service 22286 1726882794.67033: done queuing things up, now waiting for results queue to drain 22286 1726882794.67037: waiting for pending results... 
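The skip recorded above ("Evaluated conditional (__network_wpa_supplicant_required): False") follows from stacked `when` conditions. A sketch of the shape of that task, inferred only from the conditionals the log shows being evaluated (the role's real task and variable defaults may differ):

```yaml
# Hypothetical reconstruction from the evaluated conditionals in the log;
# __network_wpa_supplicant_required comes from the role's defaults and is
# False here because no IEEE 802.1X or wireless connections are defined.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required   # False -> task skipped
```

When the first false condition is found, Ansible reports it as `false_condition` in the skip result, exactly as seen in the `skipping: [managed_node3]` output that follows.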
22286 1726882794.67212: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 22286 1726882794.67314: in run() - task 0affe814-3a2d-a75d-4836-000000000025 22286 1726882794.67328: variable 'ansible_search_path' from source: unknown 22286 1726882794.67331: variable 'ansible_search_path' from source: unknown 22286 1726882794.67367: calling self._execute() 22286 1726882794.67451: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882794.67458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882794.67469: variable 'omit' from source: magic vars 22286 1726882794.67782: variable 'ansible_distribution_major_version' from source: facts 22286 1726882794.67791: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882794.67890: variable 'network_provider' from source: set_fact 22286 1726882794.67896: Evaluated conditional (network_provider == "initscripts"): False 22286 1726882794.67899: when evaluation is False, skipping this task 22286 1726882794.67904: _execute() done 22286 1726882794.67906: dumping result to json 22286 1726882794.67913: done dumping result, returning 22286 1726882794.67920: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-a75d-4836-000000000025] 22286 1726882794.67932: sending task result for task 0affe814-3a2d-a75d-4836-000000000025 22286 1726882794.68018: done sending task result for task 0affe814-3a2d-a75d-4836-000000000025 22286 1726882794.68021: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882794.68080: no more pending results, returning what we have 22286 1726882794.68084: results queue empty 22286 1726882794.68085: checking for any_errors_fatal 22286 1726882794.68094: done checking for 
any_errors_fatal 22286 1726882794.68095: checking for max_fail_percentage 22286 1726882794.68097: done checking for max_fail_percentage 22286 1726882794.68098: checking to see if all hosts have failed and the running result is not ok 22286 1726882794.68099: done checking to see if all hosts have failed 22286 1726882794.68100: getting the remaining hosts for this loop 22286 1726882794.68101: done getting the remaining hosts for this loop 22286 1726882794.68105: getting the next task for host managed_node3 22286 1726882794.68112: done getting next task for host managed_node3 22286 1726882794.68117: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22286 1726882794.68120: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882794.68143: getting variables 22286 1726882794.68145: in VariableManager get_vars() 22286 1726882794.68185: Calling all_inventory to load vars for managed_node3 22286 1726882794.68188: Calling groups_inventory to load vars for managed_node3 22286 1726882794.68191: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882794.68201: Calling all_plugins_play to load vars for managed_node3 22286 1726882794.68204: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882794.68206: Calling groups_plugins_play to load vars for managed_node3 22286 1726882794.70154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882794.73807: done with get_vars() 22286 1726882794.73848: done getting variables 22286 1726882794.73921: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:54 -0400 (0:00:00.071) 0:00:18.133 ****** 22286 1726882794.73962: entering _queue_task() for managed_node3/copy 22286 1726882794.74304: worker is 1 (out of 1 available) 22286 1726882794.74318: exiting _queue_task() for managed_node3/copy 22286 1726882794.74331: done queuing things up, now waiting for results queue to drain 22286 1726882794.74333: waiting for pending results... 
22286 1726882794.74669: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22286 1726882794.74873: in run() - task 0affe814-3a2d-a75d-4836-000000000026 22286 1726882794.74879: variable 'ansible_search_path' from source: unknown 22286 1726882794.74882: variable 'ansible_search_path' from source: unknown 22286 1726882794.74910: calling self._execute() 22286 1726882794.75026: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882794.75042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882794.75062: variable 'omit' from source: magic vars 22286 1726882794.75556: variable 'ansible_distribution_major_version' from source: facts 22286 1726882794.75569: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882794.75670: variable 'network_provider' from source: set_fact 22286 1726882794.75674: Evaluated conditional (network_provider == "initscripts"): False 22286 1726882794.75681: when evaluation is False, skipping this task 22286 1726882794.75685: _execute() done 22286 1726882794.75688: dumping result to json 22286 1726882794.75691: done dumping result, returning 22286 1726882794.75711: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-a75d-4836-000000000026] 22286 1726882794.75715: sending task result for task 0affe814-3a2d-a75d-4836-000000000026 22286 1726882794.75805: done sending task result for task 0affe814-3a2d-a75d-4836-000000000026 22286 1726882794.75810: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22286 1726882794.75868: no more pending results, returning what we have 22286 1726882794.75873: results queue empty 22286 1726882794.75874: checking for 
any_errors_fatal 22286 1726882794.75884: done checking for any_errors_fatal 22286 1726882794.75885: checking for max_fail_percentage 22286 1726882794.75887: done checking for max_fail_percentage 22286 1726882794.75888: checking to see if all hosts have failed and the running result is not ok 22286 1726882794.75889: done checking to see if all hosts have failed 22286 1726882794.75890: getting the remaining hosts for this loop 22286 1726882794.75892: done getting the remaining hosts for this loop 22286 1726882794.75896: getting the next task for host managed_node3 22286 1726882794.75903: done getting next task for host managed_node3 22286 1726882794.75907: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22286 1726882794.75910: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882794.75927: getting variables 22286 1726882794.75928: in VariableManager get_vars() 22286 1726882794.75967: Calling all_inventory to load vars for managed_node3 22286 1726882794.75970: Calling groups_inventory to load vars for managed_node3 22286 1726882794.75973: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882794.75985: Calling all_plugins_play to load vars for managed_node3 22286 1726882794.75988: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882794.75992: Calling groups_plugins_play to load vars for managed_node3 22286 1726882794.77318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882794.80186: done with get_vars() 22286 1726882794.80225: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:54 -0400 (0:00:00.063) 0:00:18.196 ****** 22286 1726882794.80337: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 22286 1726882794.80339: Creating lock for fedora.linux_system_roles.network_connections 22286 1726882794.80716: worker is 1 (out of 1 available) 22286 1726882794.80730: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 22286 1726882794.80747: done queuing things up, now waiting for results queue to drain 22286 1726882794.80748: waiting for pending results... 
22286 1726882794.81157: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22286 1726882794.81228: in run() - task 0affe814-3a2d-a75d-4836-000000000027 22286 1726882794.81258: variable 'ansible_search_path' from source: unknown 22286 1726882794.81266: variable 'ansible_search_path' from source: unknown 22286 1726882794.81321: calling self._execute() 22286 1726882794.81441: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882794.81455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882794.81481: variable 'omit' from source: magic vars 22286 1726882794.81945: variable 'ansible_distribution_major_version' from source: facts 22286 1726882794.82011: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882794.82014: variable 'omit' from source: magic vars 22286 1726882794.82068: variable 'omit' from source: magic vars 22286 1726882794.82283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882794.84888: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882794.84974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882794.85022: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882794.85159: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882794.85163: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882794.85212: variable 'network_provider' from source: set_fact 22286 1726882794.85381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882794.85437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882794.85474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882794.85541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882794.85562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882794.85661: variable 'omit' from source: magic vars 22286 1726882794.85816: variable 'omit' from source: magic vars 22286 1726882794.85960: variable 'network_connections' from source: task vars 22286 1726882794.85981: variable 'interface' from source: play vars 22286 1726882794.86070: variable 'interface' from source: play vars 22286 1726882794.86289: variable 'omit' from source: magic vars 22286 1726882794.86304: variable '__lsr_ansible_managed' from source: task vars 22286 1726882794.86459: variable '__lsr_ansible_managed' from source: task vars 22286 1726882794.86756: Loaded config def from plugin (lookup/template) 22286 1726882794.86767: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22286 1726882794.86808: File lookup term: get_ansible_managed.j2 22286 1726882794.86817: variable 'ansible_search_path' from source: unknown 22286 1726882794.86827: evaluation_path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22286 1726882794.86851: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22286 1726882794.86878: variable 'ansible_search_path' from source: unknown 22286 1726882794.95842: variable 'ansible_managed' from source: unknown 22286 1726882794.95985: variable 'omit' from source: magic vars 22286 1726882794.96010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882794.96035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882794.96055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882794.96073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882794.96084: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882794.96110: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882794.96114: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882794.96119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882794.96222: Set connection var ansible_shell_executable to /bin/sh 22286 1726882794.96230: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882794.96236: Set connection var ansible_connection to ssh 22286 1726882794.96239: Set connection var ansible_shell_type to sh 22286 1726882794.96246: Set connection var ansible_timeout to 10 22286 1726882794.96254: Set connection var ansible_pipelining to False 22286 1726882794.96309: variable 'ansible_shell_executable' from source: unknown 22286 1726882794.96313: variable 'ansible_connection' from source: unknown 22286 1726882794.96317: variable 'ansible_module_compression' from source: unknown 22286 1726882794.96320: variable 'ansible_shell_type' from source: unknown 22286 1726882794.96322: variable 'ansible_shell_executable' from source: unknown 22286 1726882794.96325: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882794.96327: variable 'ansible_pipelining' from source: unknown 22286 1726882794.96329: variable 'ansible_timeout' from source: unknown 22286 1726882794.96331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882794.96530: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882794.96543: variable 'omit' from source: magic vars 22286 1726882794.96546: starting attempt loop 22286 
1726882794.96549: running the handler 22286 1726882794.96551: _low_level_execute_command(): starting 22286 1726882794.96554: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882794.97286: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882794.97379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882794.97384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882794.97388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882794.97410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882794.97559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882794.99395: stdout chunk (state=3): >>>/root <<< 22286 1726882794.99507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882794.99582: stderr chunk (state=3): >>><<< 22286 1726882794.99585: stdout chunk (state=3): >>><<< 22286 1726882794.99598: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882794.99610: _low_level_execute_command(): starting 22286 1726882794.99669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768 `" && echo ansible-tmp-1726882794.9960258-22930-225942546670768="` echo /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768 `" ) && sleep 0' 22286 1726882795.00241: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882795.00255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882795.00267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882795.00285: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882795.00295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882795.00304: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882795.00314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882795.00330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882795.00341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882795.00350: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882795.00359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882795.00369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882795.00394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882795.00446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882795.00449: stderr chunk (state=3): >>>debug2: match found <<< 22286 1726882795.00452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882795.00503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882795.00507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882795.00509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882795.00656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882795.02801: stdout chunk (state=3): >>>ansible-tmp-1726882794.9960258-22930-225942546670768=/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768 <<< 22286 1726882795.02950: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882795.02996: stderr chunk (state=3): >>><<< 22286 1726882795.02999: stdout chunk (state=3): >>><<< 22286 1726882795.03015: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882794.9960258-22930-225942546670768=/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882795.03067: variable 'ansible_module_compression' from source: unknown 22286 1726882795.03148: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 22286 1726882795.03151: ANSIBALLZ: Acquiring lock 22286 1726882795.03155: ANSIBALLZ: Lock acquired: 140212082875504 22286 1726882795.03158: ANSIBALLZ: Creating module 22286 1726882795.21401: ANSIBALLZ: Writing module into payload 22286 1726882795.21740: ANSIBALLZ: Writing module 22286 
1726882795.21762: ANSIBALLZ: Renaming module 22286 1726882795.21768: ANSIBALLZ: Done creating module 22286 1726882795.21792: variable 'ansible_facts' from source: unknown 22286 1726882795.21861: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py 22286 1726882795.21991: Sending initial data 22286 1726882795.21995: Sent initial data (168 bytes) 22286 1726882795.22437: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882795.22456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882795.22460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882795.22477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882795.22544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882795.22550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882795.22551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882795.22667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882795.24484: stderr 
chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882795.24487: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882795.24594: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882795.24707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp4w8bxuim /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py <<< 22286 1726882795.24715: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py" <<< 22286 1726882795.24819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp4w8bxuim" to remote "/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py" <<< 22286 1726882795.26556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882795.26722: stderr chunk (state=3): >>><<< 22286 1726882795.26726: stdout chunk 
(state=3): >>><<< 22286 1726882795.26728: done transferring module to remote 22286 1726882795.26731: _low_level_execute_command(): starting 22286 1726882795.26736: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/ /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py && sleep 0' 22286 1726882795.27187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882795.27204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882795.27230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882795.27268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882795.27289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882795.27399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882795.29422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882795.29425: stdout chunk (state=3): >>><<< 
22286 1726882795.29639: stderr chunk (state=3): >>><<< 22286 1726882795.29644: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882795.29646: _low_level_execute_command(): starting 22286 1726882795.29649: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/AnsiballZ_network_connections.py && sleep 0' 22286 1726882795.30091: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882795.30149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882795.30224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882795.30237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882795.30261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882795.30409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882797.53380: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22286 1726882797.55615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882797.55619: stdout chunk (state=3): >>><<< 22286 1726882797.55622: stderr chunk (state=3): >>><<< 22286 1726882797.55644: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882797.55714: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882797.55813: _low_level_execute_command(): starting 22286 1726882797.55816: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882794.9960258-22930-225942546670768/ > /dev/null 2>&1 && sleep 0' 22286 1726882797.56359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882797.56374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882797.56388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882797.56410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882797.56424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882797.56432: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882797.56446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882797.56461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882797.56470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882797.56553: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882797.56572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882797.56585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882797.56603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882797.56751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882797.58902: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882797.58906: stdout chunk (state=3): >>><<< 22286 1726882797.58914: stderr chunk (state=3): >>><<< 22286 1726882797.58933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882797.58942: handler run complete 22286 1726882797.58995: attempt loop complete, returning result 22286 1726882797.58999: _execute() done 22286 1726882797.59238: dumping result to json 22286 1726882797.59241: done dumping result, returning 22286 1726882797.59244: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-a75d-4836-000000000027] 22286 1726882797.59245: sending task result for task 0affe814-3a2d-a75d-4836-000000000027 22286 1726882797.59322: done sending task result for task 
0affe814-3a2d-a75d-4836-000000000027 22286 1726882797.59325: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active) 22286 1726882797.59688: no more pending results, returning what we have 22286 1726882797.59694: results queue empty 22286 1726882797.59695: checking for any_errors_fatal 22286 1726882797.59701: done checking for any_errors_fatal 22286 1726882797.59703: checking for max_fail_percentage 22286 1726882797.59705: done checking for max_fail_percentage 22286 1726882797.59706: checking to see if all hosts have failed and the running result is not ok 22286 1726882797.59707: done checking to see if all hosts have failed 22286 1726882797.59708: getting the remaining hosts for this loop 22286 1726882797.59710: done getting the remaining hosts for this loop 22286 1726882797.59715: getting the next task for host managed_node3 22286 1726882797.59723: done getting next task for host managed_node3 22286 1726882797.59728: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22286 1726882797.59731: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882797.59797: getting variables 22286 1726882797.59800: in VariableManager get_vars() 22286 1726882797.59848: Calling all_inventory to load vars for managed_node3 22286 1726882797.59851: Calling groups_inventory to load vars for managed_node3 22286 1726882797.59855: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882797.59867: Calling all_plugins_play to load vars for managed_node3 22286 1726882797.59871: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882797.59877: Calling groups_plugins_play to load vars for managed_node3 22286 1726882797.62562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882797.65520: done with get_vars() 22286 1726882797.65557: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:57 -0400 (0:00:02.853) 0:00:21.049 ****** 22286 1726882797.65659: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 22286 1726882797.65661: Creating lock for fedora.linux_system_roles.network_state 22286 1726882797.65970: worker is 1 (out of 1 available) 22286 1726882797.65984: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 22286 1726882797.65999: done queuing things up, now waiting for results queue to drain 22286 1726882797.66000: waiting for pending results... 
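For context, the `module_args` recorded in the task result above correspond to a role invocation along these lines. This is a hedged reconstruction inferred from the logged parameters, not the actual playbook file used in this run:

```yaml
# Hypothetical reconstruction of the play variables that would yield the
# module_args logged above (structure inferred from the log, not the real file).
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: veth0
            type: ethernet
            state: up
            ip:
              dhcp4: false
              auto6: false
              address:
                - "2001:db8::2/32"
                - "2001:db8::3/32"
                - "2001:db8::4/32"
              gateway6: "2001:db8::1"
```

The role translates `network_connections` into a call to the `fedora.linux_system_roles.network_connections` module with `provider: nm`, which matches the invocation dumped in the result above.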
22286 1726882797.66292: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 22286 1726882797.66517: in run() - task 0affe814-3a2d-a75d-4836-000000000028 22286 1726882797.66543: variable 'ansible_search_path' from source: unknown 22286 1726882797.66547: variable 'ansible_search_path' from source: unknown 22286 1726882797.66592: calling self._execute() 22286 1726882797.66709: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.66715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.66730: variable 'omit' from source: magic vars 22286 1726882797.67277: variable 'ansible_distribution_major_version' from source: facts 22286 1726882797.67301: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882797.67464: variable 'network_state' from source: role '' defaults 22286 1726882797.67478: Evaluated conditional (network_state != {}): False 22286 1726882797.67484: when evaluation is False, skipping this task 22286 1726882797.67488: _execute() done 22286 1726882797.67491: dumping result to json 22286 1726882797.67496: done dumping result, returning 22286 1726882797.67513: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-a75d-4836-000000000028] 22286 1726882797.67520: sending task result for task 0affe814-3a2d-a75d-4836-000000000028 22286 1726882797.67620: done sending task result for task 0affe814-3a2d-a75d-4836-000000000028 22286 1726882797.67623: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882797.67681: no more pending results, returning what we have 22286 1726882797.67687: results queue empty 22286 1726882797.67688: checking for any_errors_fatal 22286 1726882797.67703: done checking for any_errors_fatal 
22286 1726882797.67704: checking for max_fail_percentage 22286 1726882797.67707: done checking for max_fail_percentage 22286 1726882797.67708: checking to see if all hosts have failed and the running result is not ok 22286 1726882797.67709: done checking to see if all hosts have failed 22286 1726882797.67710: getting the remaining hosts for this loop 22286 1726882797.67711: done getting the remaining hosts for this loop 22286 1726882797.67716: getting the next task for host managed_node3 22286 1726882797.67723: done getting next task for host managed_node3 22286 1726882797.67727: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22286 1726882797.67731: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882797.67747: getting variables 22286 1726882797.67748: in VariableManager get_vars() 22286 1726882797.67786: Calling all_inventory to load vars for managed_node3 22286 1726882797.67789: Calling groups_inventory to load vars for managed_node3 22286 1726882797.67792: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882797.67802: Calling all_plugins_play to load vars for managed_node3 22286 1726882797.67805: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882797.67808: Calling groups_plugins_play to load vars for managed_node3 22286 1726882797.70306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882797.73269: done with get_vars() 22286 1726882797.73308: done getting variables 22286 1726882797.73381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:57 -0400 (0:00:00.077) 0:00:21.127 ****** 22286 1726882797.73420: entering _queue_task() for managed_node3/debug 22286 1726882797.73776: worker is 1 (out of 1 available) 22286 1726882797.73791: exiting _queue_task() for managed_node3/debug 22286 1726882797.73805: done queuing things up, now waiting for results queue to drain 22286 1726882797.73807: waiting for pending results... 
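The "Show stderr messages for the network_connections" task queued below is a plain `debug` task; judging from the log (action plugin `debug`, task path `main.yml:177`, and the variable printed in its result), it likely resembles the following sketch:

```yaml
# Hedged sketch of the role's debug task as suggested by the log output;
# the real task lives at roles/network/tasks/main.yml:177 in the collection.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```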
22286 1726882797.74259: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22286 1726882797.74295: in run() - task 0affe814-3a2d-a75d-4836-000000000029 22286 1726882797.74319: variable 'ansible_search_path' from source: unknown 22286 1726882797.74328: variable 'ansible_search_path' from source: unknown 22286 1726882797.74380: calling self._execute() 22286 1726882797.74570: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.74574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.74578: variable 'omit' from source: magic vars 22286 1726882797.74965: variable 'ansible_distribution_major_version' from source: facts 22286 1726882797.74985: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882797.74999: variable 'omit' from source: magic vars 22286 1726882797.75077: variable 'omit' from source: magic vars 22286 1726882797.75137: variable 'omit' from source: magic vars 22286 1726882797.75225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882797.75245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882797.75273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882797.75300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882797.75317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882797.75362: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882797.75441: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.75445: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 22286 1726882797.75513: Set connection var ansible_shell_executable to /bin/sh 22286 1726882797.75529: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882797.75539: Set connection var ansible_connection to ssh 22286 1726882797.75550: Set connection var ansible_shell_type to sh 22286 1726882797.75562: Set connection var ansible_timeout to 10 22286 1726882797.75576: Set connection var ansible_pipelining to False 22286 1726882797.75606: variable 'ansible_shell_executable' from source: unknown 22286 1726882797.75615: variable 'ansible_connection' from source: unknown 22286 1726882797.75623: variable 'ansible_module_compression' from source: unknown 22286 1726882797.75630: variable 'ansible_shell_type' from source: unknown 22286 1726882797.75641: variable 'ansible_shell_executable' from source: unknown 22286 1726882797.75650: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.75663: variable 'ansible_pipelining' from source: unknown 22286 1726882797.75671: variable 'ansible_timeout' from source: unknown 22286 1726882797.75839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.75843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882797.75863: variable 'omit' from source: magic vars 22286 1726882797.75873: starting attempt loop 22286 1726882797.75881: running the handler 22286 1726882797.76038: variable '__network_connections_result' from source: set_fact 22286 1726882797.76111: handler run complete 22286 1726882797.76142: attempt loop complete, returning result 22286 1726882797.76150: _execute() done 22286 1726882797.76157: dumping result to json 22286 1726882797.76166: 
done dumping result, returning 22286 1726882797.76184: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-a75d-4836-000000000029] 22286 1726882797.76195: sending task result for task 0affe814-3a2d-a75d-4836-000000000029 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active)" ] } 22286 1726882797.76431: no more pending results, returning what we have 22286 1726882797.76437: results queue empty 22286 1726882797.76439: checking for any_errors_fatal 22286 1726882797.76447: done checking for any_errors_fatal 22286 1726882797.76448: checking for max_fail_percentage 22286 1726882797.76450: done checking for max_fail_percentage 22286 1726882797.76452: checking to see if all hosts have failed and the running result is not ok 22286 1726882797.76453: done checking to see if all hosts have failed 22286 1726882797.76454: getting the remaining hosts for this loop 22286 1726882797.76456: done getting the remaining hosts for this loop 22286 1726882797.76460: getting the next task for host managed_node3 22286 1726882797.76468: done getting next task for host managed_node3 22286 1726882797.76473: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22286 1726882797.76477: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882797.76492: getting variables 22286 1726882797.76494: in VariableManager get_vars() 22286 1726882797.76743: Calling all_inventory to load vars for managed_node3 22286 1726882797.76747: Calling groups_inventory to load vars for managed_node3 22286 1726882797.76750: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882797.76760: Calling all_plugins_play to load vars for managed_node3 22286 1726882797.76764: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882797.76768: Calling groups_plugins_play to load vars for managed_node3 22286 1726882797.77456: done sending task result for task 0affe814-3a2d-a75d-4836-000000000029 22286 1726882797.77460: WORKER PROCESS EXITING 22286 1726882797.79414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882797.83167: done with get_vars() 22286 1726882797.83214: done getting variables 22286 1726882797.83290: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:57 -0400 (0:00:00.099) 0:00:21.226 ****** 22286 1726882797.83330: entering _queue_task() for managed_node3/debug 22286 1726882797.83804: worker is 1 (out of 1 available) 22286 1726882797.83818: exiting _queue_task() for managed_node3/debug 22286 1726882797.83831: done queuing things up, now waiting 
for results queue to drain 22286 1726882797.83833: waiting for pending results... 22286 1726882797.84351: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22286 1726882797.84704: in run() - task 0affe814-3a2d-a75d-4836-00000000002a 22286 1726882797.84962: variable 'ansible_search_path' from source: unknown 22286 1726882797.84971: variable 'ansible_search_path' from source: unknown 22286 1726882797.85024: calling self._execute() 22286 1726882797.85640: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.85644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.85647: variable 'omit' from source: magic vars 22286 1726882797.85962: variable 'ansible_distribution_major_version' from source: facts 22286 1726882797.85983: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882797.85996: variable 'omit' from source: magic vars 22286 1726882797.86077: variable 'omit' from source: magic vars 22286 1726882797.86126: variable 'omit' from source: magic vars 22286 1726882797.86177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882797.86222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882797.86252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882797.86280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882797.86355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882797.86395: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882797.86740: variable 'ansible_host' from source: host vars 
for 'managed_node3' 22286 1726882797.86744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.86746: Set connection var ansible_shell_executable to /bin/sh 22286 1726882797.86749: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882797.86751: Set connection var ansible_connection to ssh 22286 1726882797.86753: Set connection var ansible_shell_type to sh 22286 1726882797.86755: Set connection var ansible_timeout to 10 22286 1726882797.86757: Set connection var ansible_pipelining to False 22286 1726882797.87025: variable 'ansible_shell_executable' from source: unknown 22286 1726882797.87148: variable 'ansible_connection' from source: unknown 22286 1726882797.87157: variable 'ansible_module_compression' from source: unknown 22286 1726882797.87165: variable 'ansible_shell_type' from source: unknown 22286 1726882797.87172: variable 'ansible_shell_executable' from source: unknown 22286 1726882797.87179: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.87187: variable 'ansible_pipelining' from source: unknown 22286 1726882797.87194: variable 'ansible_timeout' from source: unknown 22286 1726882797.87203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.87477: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882797.87840: variable 'omit' from source: magic vars 22286 1726882797.87844: starting attempt loop 22286 1726882797.87846: running the handler 22286 1726882797.87848: variable '__network_connections_result' from source: set_fact 22286 1726882797.87851: variable '__network_connections_result' from source: set_fact 22286 1726882797.88096: handler run complete 22286 
1726882797.88281: attempt loop complete, returning result 22286 1726882797.88347: _execute() done 22286 1726882797.88355: dumping result to json 22286 1726882797.88365: done dumping result, returning 22286 1726882797.88642: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-a75d-4836-00000000002a] 22286 1726882797.88646: sending task result for task 0affe814-3a2d-a75d-4836-00000000002a 22286 1726882797.88724: done sending task result for task 0affe814-3a2d-a75d-4836-00000000002a 22286 1726882797.88727: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active)" ] } } 22286 1726882797.88857: no more pending results, returning what we have 22286 1726882797.88860: results queue empty 22286 1726882797.88861: checking for any_errors_fatal 22286 1726882797.88869: done checking for any_errors_fatal 22286 1726882797.88870: checking for max_fail_percentage 22286 1726882797.88872: done checking for max_fail_percentage 
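When auditing logs like the `__network_connections_result` JSON printed above, it can help to post-process the result payload programmatically. A minimal sketch, assuming only the field names visible in the output above (`changed`, `failed`, `stderr_lines`); `summarize` is an illustrative helper, not part of Ansible:

```python
import json

# Sample payload shaped like the __network_connections_result JSON above,
# trimmed to the fields this sketch inspects.
result_json = """
{"changed": true, "failed": false,
 "stderr_lines": [
   "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28",
   "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28 (not-active)"
 ]}
"""

result = json.loads(result_json)

def summarize(result: dict) -> str:
    """Return a one-line summary of a network_connections module result."""
    status = "CHANGED" if result.get("changed") else "ok"
    if result.get("failed"):
        status = "FAILED"
    return f"{status}: {len(result.get('stderr_lines', []))} stderr line(s)"

print(summarize(result))  # -> CHANGED: 2 stderr line(s)
```

The same pattern extends to pulling the `module_args` out of `_invocation` when comparing what the role actually sent against the play variables.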
22286 1726882797.88873: checking to see if all hosts have failed and the running result is not ok 22286 1726882797.88874: done checking to see if all hosts have failed 22286 1726882797.88877: getting the remaining hosts for this loop 22286 1726882797.88879: done getting the remaining hosts for this loop 22286 1726882797.88884: getting the next task for host managed_node3 22286 1726882797.88892: done getting next task for host managed_node3 22286 1726882797.88897: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22286 1726882797.88900: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882797.88913: getting variables 22286 1726882797.88914: in VariableManager get_vars() 22286 1726882797.89264: Calling all_inventory to load vars for managed_node3 22286 1726882797.89273: Calling groups_inventory to load vars for managed_node3 22286 1726882797.89279: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882797.89291: Calling all_plugins_play to load vars for managed_node3 22286 1726882797.89295: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882797.89298: Calling groups_plugins_play to load vars for managed_node3 22286 1726882797.93415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882797.96335: done with get_vars() 22286 1726882797.96377: done getting variables 22286 1726882797.96450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:57 -0400 (0:00:00.131) 0:00:21.358 ****** 22286 1726882797.96491: entering _queue_task() for managed_node3/debug 22286 1726882797.96849: worker is 1 (out of 1 available) 22286 1726882797.96863: exiting _queue_task() for managed_node3/debug 22286 1726882797.96876: done queuing things up, now waiting for results queue to drain 22286 1726882797.96878: waiting for pending results... 
22286 1726882797.97190: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22286 1726882797.97371: in run() - task 0affe814-3a2d-a75d-4836-00000000002b 22286 1726882797.97394: variable 'ansible_search_path' from source: unknown 22286 1726882797.97402: variable 'ansible_search_path' from source: unknown 22286 1726882797.97450: calling self._execute() 22286 1726882797.97562: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882797.97582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882797.97602: variable 'omit' from source: magic vars 22286 1726882797.98031: variable 'ansible_distribution_major_version' from source: facts 22286 1726882797.98052: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882797.98214: variable 'network_state' from source: role '' defaults 22286 1726882797.98238: Evaluated conditional (network_state != {}): False 22286 1726882797.98247: when evaluation is False, skipping this task 22286 1726882797.98253: _execute() done 22286 1726882797.98260: dumping result to json 22286 1726882797.98269: done dumping result, returning 22286 1726882797.98280: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-a75d-4836-00000000002b] 22286 1726882797.98290: sending task result for task 0affe814-3a2d-a75d-4836-00000000002b skipping: [managed_node3] => { "false_condition": "network_state != {}" } 22286 1726882797.98456: no more pending results, returning what we have 22286 1726882797.98461: results queue empty 22286 1726882797.98462: checking for any_errors_fatal 22286 1726882797.98475: done checking for any_errors_fatal 22286 1726882797.98476: checking for max_fail_percentage 22286 1726882797.98480: done checking for max_fail_percentage 22286 1726882797.98482: checking to see if all hosts have 
failed and the running result is not ok 22286 1726882797.98483: done checking to see if all hosts have failed 22286 1726882797.98485: getting the remaining hosts for this loop 22286 1726882797.98487: done getting the remaining hosts for this loop 22286 1726882797.98491: getting the next task for host managed_node3 22286 1726882797.98500: done getting next task for host managed_node3 22286 1726882797.98505: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22286 1726882797.98509: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882797.98529: getting variables 22286 1726882797.98531: in VariableManager get_vars() 22286 1726882797.98581: Calling all_inventory to load vars for managed_node3 22286 1726882797.98585: Calling groups_inventory to load vars for managed_node3 22286 1726882797.98588: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882797.98602: Calling all_plugins_play to load vars for managed_node3 22286 1726882797.98606: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882797.98610: Calling groups_plugins_play to load vars for managed_node3 22286 1726882797.99606: done sending task result for task 0affe814-3a2d-a75d-4836-00000000002b 22286 1726882797.99609: WORKER PROCESS EXITING 22286 1726882798.03713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882798.07997: done with get_vars() 22286 1726882798.08246: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:58 -0400 (0:00:00.119) 0:00:21.477 ****** 22286 1726882798.08406: entering _queue_task() for managed_node3/ping 22286 1726882798.08408: Creating lock for ping 22286 1726882798.09193: worker is 1 (out of 1 available) 22286 1726882798.09280: exiting _queue_task() for managed_node3/ping 22286 1726882798.09295: done queuing things up, now waiting for results queue to drain 22286 1726882798.09297: waiting for pending results... 
22286 1726882798.09695: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 22286 1726882798.09901: in run() - task 0affe814-3a2d-a75d-4836-00000000002c 22286 1726882798.09924: variable 'ansible_search_path' from source: unknown 22286 1726882798.09933: variable 'ansible_search_path' from source: unknown 22286 1726882798.09987: calling self._execute() 22286 1726882798.10107: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882798.10185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882798.10192: variable 'omit' from source: magic vars 22286 1726882798.10972: variable 'ansible_distribution_major_version' from source: facts 22286 1726882798.10994: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882798.11006: variable 'omit' from source: magic vars 22286 1726882798.11091: variable 'omit' from source: magic vars 22286 1726882798.11137: variable 'omit' from source: magic vars 22286 1726882798.11193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882798.11243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882798.11285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882798.11304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882798.11393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882798.11396: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882798.11399: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882798.11402: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 22286 1726882798.11518: Set connection var ansible_shell_executable to /bin/sh 22286 1726882798.11535: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882798.11543: Set connection var ansible_connection to ssh 22286 1726882798.11551: Set connection var ansible_shell_type to sh 22286 1726882798.11562: Set connection var ansible_timeout to 10 22286 1726882798.11578: Set connection var ansible_pipelining to False 22286 1726882798.11616: variable 'ansible_shell_executable' from source: unknown 22286 1726882798.11626: variable 'ansible_connection' from source: unknown 22286 1726882798.11637: variable 'ansible_module_compression' from source: unknown 22286 1726882798.11645: variable 'ansible_shell_type' from source: unknown 22286 1726882798.11652: variable 'ansible_shell_executable' from source: unknown 22286 1726882798.11659: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882798.11668: variable 'ansible_pipelining' from source: unknown 22286 1726882798.11723: variable 'ansible_timeout' from source: unknown 22286 1726882798.11727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882798.11950: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882798.11968: variable 'omit' from source: magic vars 22286 1726882798.11982: starting attempt loop 22286 1726882798.11989: running the handler 22286 1726882798.12010: _low_level_execute_command(): starting 22286 1726882798.12024: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882798.12806: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882798.12931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 
1726882798.12958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.12973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882798.12992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882798.13019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882798.13267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882798.15115: stdout chunk (state=3): >>>/root <<< 22286 1726882798.15252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882798.15311: stderr chunk (state=3): >>><<< 22286 1726882798.15329: stdout chunk (state=3): >>><<< 22286 1726882798.15364: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882798.15390: _low_level_execute_command(): starting 22286 1726882798.15409: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925 `" && echo ansible-tmp-1726882798.1537352-23032-190244819565925="` echo /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925 `" ) && sleep 0' 22286 1726882798.16043: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882798.16058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882798.16073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882798.16099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882798.16117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882798.16215: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.16257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882798.16277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882798.16298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882798.16451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882798.18566: stdout chunk (state=3): >>>ansible-tmp-1726882798.1537352-23032-190244819565925=/root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925 <<< 22286 1726882798.18766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882798.18799: stdout chunk (state=3): >>><<< 22286 1726882798.18803: stderr chunk (state=3): >>><<< 22286 1726882798.18823: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882798.1537352-23032-190244819565925=/root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882798.18913: variable 'ansible_module_compression' from source: unknown 22286 1726882798.18955: ANSIBALLZ: Using lock for ping 22286 1726882798.18964: ANSIBALLZ: Acquiring lock 22286 1726882798.18972: ANSIBALLZ: Lock acquired: 140212083392928 22286 1726882798.18981: ANSIBALLZ: Creating module 22286 1726882798.36840: ANSIBALLZ: Writing module into payload 22286 1726882798.36911: ANSIBALLZ: Writing module 22286 1726882798.36951: ANSIBALLZ: Renaming module 22286 1726882798.36954: ANSIBALLZ: Done creating module 22286 1726882798.37040: variable 'ansible_facts' from source: unknown 22286 1726882798.37063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py 22286 1726882798.37318: Sending initial data 22286 1726882798.37321: Sent initial data (153 bytes) 22286 1726882798.37906: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882798.37921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882798.37941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882798.37974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.38063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.38099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882798.38123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882798.38140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882798.38292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882798.40126: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882798.40263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882798.40413: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpylmasero /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py <<< 22286 1726882798.40416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py" <<< 22286 1726882798.40524: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpylmasero" to remote "/root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py" <<< 22286 1726882798.42162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882798.42215: stderr chunk (state=3): >>><<< 22286 1726882798.42226: stdout chunk (state=3): >>><<< 22286 1726882798.42281: done transferring module to remote 22286 1726882798.42306: _low_level_execute_command(): starting 22286 1726882798.42318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/ /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py && sleep 0' 22286 1726882798.42969: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882798.42980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882798.42997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882798.43022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882798.43026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 
1726882798.43035: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882798.43040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.43143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882798.43151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882798.43181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882798.43300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882798.45561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882798.45564: stdout chunk (state=3): >>><<< 22286 1726882798.45567: stderr chunk (state=3): >>><<< 22286 1726882798.45569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882798.45572: _low_level_execute_command(): starting 22286 1726882798.45574: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/AnsiballZ_ping.py && sleep 0' 22286 1726882798.46761: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882798.46786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882798.46849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.46987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 
1726882798.46991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882798.46994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882798.47215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882798.64546: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22286 1726882798.66483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882798.66488: stdout chunk (state=3): >>><<< 22286 1726882798.66490: stderr chunk (state=3): >>><<< 22286 1726882798.66493: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882798.66496: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882798.66498: _low_level_execute_command(): starting 22286 1726882798.66501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882798.1537352-23032-190244819565925/ > /dev/null 2>&1 && sleep 0' 22286 1726882798.67187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882798.67191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.67212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882798.67216: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882798.67279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882798.67314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882798.67427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882798.69700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882798.69704: stdout chunk (state=3): >>><<< 22286 1726882798.69707: stderr chunk (state=3): >>><<< 22286 1726882798.69710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882798.69718: handler run complete 22286 1726882798.69917: attempt loop complete, returning result 
22286 1726882798.69926: _execute() done 22286 1726882798.69929: dumping result to json 22286 1726882798.69931: done dumping result, returning 22286 1726882798.69936: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-a75d-4836-00000000002c] 22286 1726882798.69938: sending task result for task 0affe814-3a2d-a75d-4836-00000000002c ok: [managed_node3] => { "changed": false, "ping": "pong" } 22286 1726882798.70097: no more pending results, returning what we have 22286 1726882798.70102: results queue empty 22286 1726882798.70103: checking for any_errors_fatal 22286 1726882798.70111: done checking for any_errors_fatal 22286 1726882798.70112: checking for max_fail_percentage 22286 1726882798.70115: done checking for max_fail_percentage 22286 1726882798.70116: checking to see if all hosts have failed and the running result is not ok 22286 1726882798.70118: done checking to see if all hosts have failed 22286 1726882798.70119: getting the remaining hosts for this loop 22286 1726882798.70121: done getting the remaining hosts for this loop 22286 1726882798.70129: getting the next task for host managed_node3 22286 1726882798.70342: done getting next task for host managed_node3 22286 1726882798.70350: ^ task is: TASK: meta (role_complete) 22286 1726882798.70355: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882798.70370: getting variables 22286 1726882798.70372: in VariableManager get_vars() 22286 1726882798.70424: Calling all_inventory to load vars for managed_node3 22286 1726882798.70428: Calling groups_inventory to load vars for managed_node3 22286 1726882798.70431: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882798.70603: done sending task result for task 0affe814-3a2d-a75d-4836-00000000002c 22286 1726882798.70607: WORKER PROCESS EXITING 22286 1726882798.70620: Calling all_plugins_play to load vars for managed_node3 22286 1726882798.70625: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882798.70630: Calling groups_plugins_play to load vars for managed_node3 22286 1726882798.74844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882798.78796: done with get_vars() 22286 1726882798.78831: done getting variables 22286 1726882798.78904: done queuing things up, now waiting for results queue to drain 22286 1726882798.78906: results queue empty 22286 1726882798.78906: checking for any_errors_fatal 22286 1726882798.78909: done checking for any_errors_fatal 22286 1726882798.78909: checking for max_fail_percentage 22286 1726882798.78910: done checking for max_fail_percentage 22286 1726882798.78911: checking to see if all hosts have failed and the running result is not ok 22286 1726882798.78912: done checking to see if all hosts have failed 22286 1726882798.78912: getting the remaining hosts for this loop 22286 1726882798.78913: done getting the remaining hosts for this loop 22286 1726882798.78915: getting the next task for host managed_node3 22286 1726882798.78920: done getting next task for host managed_node3 22286 1726882798.78922: ^ task is: TASK: Include the task 'assert_device_present.yml' 22286 1726882798.78923: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882798.78925: getting variables 22286 1726882798.78926: in VariableManager get_vars() 22286 1726882798.78940: Calling all_inventory to load vars for managed_node3 22286 1726882798.78942: Calling groups_inventory to load vars for managed_node3 22286 1726882798.78944: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882798.78948: Calling all_plugins_play to load vars for managed_node3 22286 1726882798.78949: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882798.78952: Calling groups_plugins_play to load vars for managed_node3 22286 1726882798.80051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882798.82148: done with get_vars() 22286 1726882798.82169: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Friday 20 September 2024 21:39:58 -0400 (0:00:00.738) 0:00:22.215 ****** 22286 1726882798.82238: entering _queue_task() for managed_node3/include_tasks 22286 1726882798.82510: worker is 1 (out of 1 available) 22286 1726882798.82524: exiting _queue_task() for managed_node3/include_tasks 22286 1726882798.82539: done queuing things up, now waiting for results queue to drain 22286 1726882798.82541: waiting for pending results... 
22286 1726882798.82723: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 22286 1726882798.82807: in run() - task 0affe814-3a2d-a75d-4836-00000000005c 22286 1726882798.82819: variable 'ansible_search_path' from source: unknown 22286 1726882798.82855: calling self._execute() 22286 1726882798.82939: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882798.82945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882798.82956: variable 'omit' from source: magic vars 22286 1726882798.83282: variable 'ansible_distribution_major_version' from source: facts 22286 1726882798.83292: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882798.83298: _execute() done 22286 1726882798.83301: dumping result to json 22286 1726882798.83308: done dumping result, returning 22286 1726882798.83316: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0affe814-3a2d-a75d-4836-00000000005c] 22286 1726882798.83321: sending task result for task 0affe814-3a2d-a75d-4836-00000000005c 22286 1726882798.83423: done sending task result for task 0affe814-3a2d-a75d-4836-00000000005c 22286 1726882798.83426: WORKER PROCESS EXITING 22286 1726882798.83460: no more pending results, returning what we have 22286 1726882798.83465: in VariableManager get_vars() 22286 1726882798.83519: Calling all_inventory to load vars for managed_node3 22286 1726882798.83522: Calling groups_inventory to load vars for managed_node3 22286 1726882798.83525: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882798.83552: Calling all_plugins_play to load vars for managed_node3 22286 1726882798.83557: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882798.83561: Calling groups_plugins_play to load vars for managed_node3 22286 1726882798.89496: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882798.91062: done with get_vars() 22286 1726882798.91096: variable 'ansible_search_path' from source: unknown 22286 1726882798.91110: we have included files to process 22286 1726882798.91111: generating all_blocks data 22286 1726882798.91113: done generating all_blocks data 22286 1726882798.91116: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22286 1726882798.91117: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22286 1726882798.91120: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22286 1726882798.91296: in VariableManager get_vars() 22286 1726882798.91324: done with get_vars() 22286 1726882798.91455: done processing included file 22286 1726882798.91457: iterating over new_blocks loaded from include file 22286 1726882798.91459: in VariableManager get_vars() 22286 1726882798.91482: done with get_vars() 22286 1726882798.91485: filtering new block on tags 22286 1726882798.91505: done filtering new block on tags 22286 1726882798.91508: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 22286 1726882798.91513: extending task lists for all hosts with included blocks 22286 1726882798.94538: done extending task lists 22286 1726882798.94540: done processing included files 22286 1726882798.94541: results queue empty 22286 1726882798.94542: checking for any_errors_fatal 22286 1726882798.94544: done checking for any_errors_fatal 22286 1726882798.94545: checking for max_fail_percentage 22286 1726882798.94546: done 
checking for max_fail_percentage 22286 1726882798.94547: checking to see if all hosts have failed and the running result is not ok 22286 1726882798.94548: done checking to see if all hosts have failed 22286 1726882798.94549: getting the remaining hosts for this loop 22286 1726882798.94551: done getting the remaining hosts for this loop 22286 1726882798.94554: getting the next task for host managed_node3 22286 1726882798.94559: done getting next task for host managed_node3 22286 1726882798.94561: ^ task is: TASK: Include the task 'get_interface_stat.yml' 22286 1726882798.94564: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882798.94567: getting variables 22286 1726882798.94568: in VariableManager get_vars() 22286 1726882798.94588: Calling all_inventory to load vars for managed_node3 22286 1726882798.94591: Calling groups_inventory to load vars for managed_node3 22286 1726882798.94594: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882798.94601: Calling all_plugins_play to load vars for managed_node3 22286 1726882798.94605: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882798.94609: Calling groups_plugins_play to load vars for managed_node3 22286 1726882798.96764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882798.98329: done with get_vars() 22286 1726882798.98353: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:39:58 -0400 (0:00:00.161) 0:00:22.377 ****** 22286 1726882798.98409: entering _queue_task() for managed_node3/include_tasks 22286 1726882798.98666: worker is 1 (out of 1 available) 22286 1726882798.98676: exiting _queue_task() for managed_node3/include_tasks 22286 1726882798.98689: done queuing things up, now waiting for results queue to drain 22286 1726882798.98691: waiting for pending results... 
22286 1726882798.98883: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 22286 1726882798.98966: in run() - task 0affe814-3a2d-a75d-4836-0000000002b5 22286 1726882798.98978: variable 'ansible_search_path' from source: unknown 22286 1726882798.98983: variable 'ansible_search_path' from source: unknown 22286 1726882798.99016: calling self._execute() 22286 1726882798.99102: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882798.99109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882798.99120: variable 'omit' from source: magic vars 22286 1726882798.99450: variable 'ansible_distribution_major_version' from source: facts 22286 1726882798.99463: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882798.99468: _execute() done 22286 1726882798.99476: dumping result to json 22286 1726882798.99479: done dumping result, returning 22286 1726882798.99486: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-a75d-4836-0000000002b5] 22286 1726882798.99492: sending task result for task 0affe814-3a2d-a75d-4836-0000000002b5 22286 1726882798.99583: done sending task result for task 0affe814-3a2d-a75d-4836-0000000002b5 22286 1726882798.99586: WORKER PROCESS EXITING 22286 1726882798.99616: no more pending results, returning what we have 22286 1726882798.99621: in VariableManager get_vars() 22286 1726882798.99672: Calling all_inventory to load vars for managed_node3 22286 1726882798.99676: Calling groups_inventory to load vars for managed_node3 22286 1726882798.99679: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882798.99693: Calling all_plugins_play to load vars for managed_node3 22286 1726882798.99697: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882798.99700: Calling groups_plugins_play to load vars for managed_node3 22286 
1726882799.00923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.02493: done with get_vars() 22286 1726882799.02511: variable 'ansible_search_path' from source: unknown 22286 1726882799.02512: variable 'ansible_search_path' from source: unknown 22286 1726882799.02541: we have included files to process 22286 1726882799.02542: generating all_blocks data 22286 1726882799.02543: done generating all_blocks data 22286 1726882799.02544: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22286 1726882799.02545: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22286 1726882799.02547: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22286 1726882799.02728: done processing included file 22286 1726882799.02730: iterating over new_blocks loaded from include file 22286 1726882799.02731: in VariableManager get_vars() 22286 1726882799.02748: done with get_vars() 22286 1726882799.02749: filtering new block on tags 22286 1726882799.02761: done filtering new block on tags 22286 1726882799.02762: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 22286 1726882799.02766: extending task lists for all hosts with included blocks 22286 1726882799.02849: done extending task lists 22286 1726882799.02851: done processing included files 22286 1726882799.02851: results queue empty 22286 1726882799.02852: checking for any_errors_fatal 22286 1726882799.02855: done checking for any_errors_fatal 22286 1726882799.02855: checking for max_fail_percentage 22286 1726882799.02856: done checking for 
max_fail_percentage 22286 1726882799.02857: checking to see if all hosts have failed and the running result is not ok 22286 1726882799.02857: done checking to see if all hosts have failed 22286 1726882799.02858: getting the remaining hosts for this loop 22286 1726882799.02859: done getting the remaining hosts for this loop 22286 1726882799.02861: getting the next task for host managed_node3 22286 1726882799.02864: done getting next task for host managed_node3 22286 1726882799.02866: ^ task is: TASK: Get stat for interface {{ interface }} 22286 1726882799.02868: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882799.02870: getting variables 22286 1726882799.02870: in VariableManager get_vars() 22286 1726882799.02885: Calling all_inventory to load vars for managed_node3 22286 1726882799.02886: Calling groups_inventory to load vars for managed_node3 22286 1726882799.02888: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882799.02893: Calling all_plugins_play to load vars for managed_node3 22286 1726882799.02895: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882799.02898: Calling groups_plugins_play to load vars for managed_node3 22286 1726882799.04049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.05613: done with get_vars() 22286 1726882799.05635: done getting variables 22286 1726882799.05761: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:59 -0400 (0:00:00.073) 0:00:22.451 ****** 22286 1726882799.05784: entering _queue_task() for managed_node3/stat 22286 1726882799.06019: worker is 1 (out of 1 available) 22286 1726882799.06031: exiting _queue_task() for managed_node3/stat 22286 1726882799.06046: done queuing things up, now waiting for results queue to drain 22286 1726882799.06048: waiting for pending results... 
22286 1726882799.06226: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 22286 1726882799.06314: in run() - task 0affe814-3a2d-a75d-4836-0000000003a0 22286 1726882799.06325: variable 'ansible_search_path' from source: unknown 22286 1726882799.06329: variable 'ansible_search_path' from source: unknown 22286 1726882799.06361: calling self._execute() 22286 1726882799.06443: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.06449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.06460: variable 'omit' from source: magic vars 22286 1726882799.06941: variable 'ansible_distribution_major_version' from source: facts 22286 1726882799.06945: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882799.06948: variable 'omit' from source: magic vars 22286 1726882799.06999: variable 'omit' from source: magic vars 22286 1726882799.07130: variable 'interface' from source: play vars 22286 1726882799.07157: variable 'omit' from source: magic vars 22286 1726882799.07217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882799.07265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882799.07304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882799.07404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882799.07408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882799.07411: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882799.07414: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.07417: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.07554: Set connection var ansible_shell_executable to /bin/sh 22286 1726882799.07572: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882799.07581: Set connection var ansible_connection to ssh 22286 1726882799.07588: Set connection var ansible_shell_type to sh 22286 1726882799.07599: Set connection var ansible_timeout to 10 22286 1726882799.07624: Set connection var ansible_pipelining to False 22286 1726882799.07658: variable 'ansible_shell_executable' from source: unknown 22286 1726882799.07683: variable 'ansible_connection' from source: unknown 22286 1726882799.07687: variable 'ansible_module_compression' from source: unknown 22286 1726882799.07690: variable 'ansible_shell_type' from source: unknown 22286 1726882799.07693: variable 'ansible_shell_executable' from source: unknown 22286 1726882799.07711: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.07715: variable 'ansible_pipelining' from source: unknown 22286 1726882799.07717: variable 'ansible_timeout' from source: unknown 22286 1726882799.07719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.07908: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882799.07920: variable 'omit' from source: magic vars 22286 1726882799.07926: starting attempt loop 22286 1726882799.07929: running the handler 22286 1726882799.07951: _low_level_execute_command(): starting 22286 1726882799.07955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882799.08466: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882799.08470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882799.08474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882799.08477: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882799.08528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882799.08532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882799.08660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882799.10533: stdout chunk (state=3): >>>/root <<< 22286 1726882799.10692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882799.10726: stderr chunk (state=3): >>><<< 22286 1726882799.10743: stdout chunk (state=3): >>><<< 22286 1726882799.10771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882799.10800: _low_level_execute_command(): starting 22286 1726882799.10892: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080 `" && echo ansible-tmp-1726882799.1077864-23072-144458974125080="` echo /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080 `" ) && sleep 0' 22286 1726882799.11451: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882799.11555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882799.11583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882799.11601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882799.11809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882799.13901: stdout chunk (state=3): >>>ansible-tmp-1726882799.1077864-23072-144458974125080=/root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080 <<< 22286 1726882799.14106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882799.14109: stdout chunk (state=3): >>><<< 22286 1726882799.14111: stderr chunk (state=3): >>><<< 22286 1726882799.14133: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882799.1077864-23072-144458974125080=/root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882799.14340: variable 'ansible_module_compression' from source: unknown 22286 1726882799.14343: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22286 1726882799.14346: variable 'ansible_facts' from source: unknown 22286 1726882799.14383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py 22286 1726882799.14596: Sending initial data 22286 1726882799.14599: Sent initial data (153 bytes) 22286 1726882799.16170: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882799.16275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK <<< 22286 1726882799.16300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882799.16561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882799.18262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882799.18372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882799.18489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp7w_t8fyb /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py <<< 22286 1726882799.18500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py" <<< 22286 1726882799.18660: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp7w_t8fyb" to remote "/root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py" <<< 22286 1726882799.21093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882799.21128: stderr chunk (state=3): >>><<< 22286 1726882799.21141: stdout chunk (state=3): >>><<< 22286 1726882799.21170: done transferring module to remote 22286 1726882799.21207: _low_level_execute_command(): starting 22286 1726882799.21383: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/ /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py && sleep 0' 22286 1726882799.22397: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882799.22412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882799.22642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882799.22852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882799.22998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882799.25050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882799.25246: stderr chunk (state=3): >>><<< 22286 1726882799.25265: stdout chunk (state=3): >>><<< 22286 1726882799.25364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882799.25372: _low_level_execute_command(): starting 22286 1726882799.25374: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/AnsiballZ_stat.py && sleep 0' 22286 1726882799.26572: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882799.26586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882799.26649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882799.26707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882799.26953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882799.26971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882799.27242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882799.44478: stdout chunk (state=3): 
>>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38397, "dev": 23, "nlink": 1, "atime": 1726882786.0343666, "mtime": 1726882786.0343666, "ctime": 1726882786.0343666, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22286 1726882799.45974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882799.46055: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882799.46187: stderr chunk (state=3): >>><<< 22286 1726882799.46199: stdout chunk (state=3): >>><<< 22286 1726882799.46221: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38397, "dev": 23, "nlink": 1, "atime": 1726882786.0343666, "mtime": 1726882786.0343666, "ctime": 1726882786.0343666, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882799.46304: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882799.46317: _low_level_execute_command(): starting 22286 1726882799.46324: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882799.1077864-23072-144458974125080/ > /dev/null 2>&1 && sleep 0' 22286 1726882799.47694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882799.47924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882799.48158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882799.48303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882799.50459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882799.50562: stderr chunk (state=3): >>><<< 22286 1726882799.50788: stdout chunk (state=3): >>><<< 22286 1726882799.50792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882799.50799: handler run complete 22286 1726882799.50802: attempt loop complete, returning result 22286 1726882799.50804: _execute() done 22286 1726882799.50806: dumping result to json 22286 1726882799.50808: done dumping result, returning 22286 1726882799.50900: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0affe814-3a2d-a75d-4836-0000000003a0] 22286 1726882799.50914: sending task result for task 0affe814-3a2d-a75d-4836-0000000003a0 22286 1726882799.51241: done sending task result for task 0affe814-3a2d-a75d-4836-0000000003a0 22286 1726882799.51245: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882786.0343666, "block_size": 4096, "blocks": 0, "ctime": 1726882786.0343666, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38397, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882786.0343666, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 22286 1726882799.51387: no more pending results, returning what we have 22286 1726882799.51392: results queue empty 22286 1726882799.51393: checking for any_errors_fatal 22286 1726882799.51396: done checking for any_errors_fatal 22286 1726882799.51397: checking for max_fail_percentage 22286 1726882799.51399: done checking for max_fail_percentage 22286 1726882799.51401: checking to see if all hosts have failed and the running 
result is not ok 22286 1726882799.51402: done checking to see if all hosts have failed 22286 1726882799.51403: getting the remaining hosts for this loop 22286 1726882799.51405: done getting the remaining hosts for this loop 22286 1726882799.51410: getting the next task for host managed_node3 22286 1726882799.51421: done getting next task for host managed_node3 22286 1726882799.51424: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 22286 1726882799.51429: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882799.51872: getting variables 22286 1726882799.51874: in VariableManager get_vars() 22286 1726882799.51915: Calling all_inventory to load vars for managed_node3 22286 1726882799.51919: Calling groups_inventory to load vars for managed_node3 22286 1726882799.51922: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882799.51936: Calling all_plugins_play to load vars for managed_node3 22286 1726882799.51940: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882799.51944: Calling groups_plugins_play to load vars for managed_node3 22286 1726882799.56688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.62458: done with get_vars() 22286 1726882799.62506: done getting variables 22286 1726882799.62830: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 22286 1726882799.63176: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:39:59 -0400 (0:00:00.574) 0:00:23.025 ****** 22286 1726882799.63210: entering _queue_task() for managed_node3/assert 22286 1726882799.63213: Creating lock for assert 22286 1726882799.64000: worker is 1 (out of 1 available) 22286 1726882799.64014: exiting _queue_task() for managed_node3/assert 22286 1726882799.64028: done queuing things up, now waiting for results queue to drain 22286 1726882799.64029: waiting for pending results... 
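The stat task above reports `/sys/class/net/veth0` as a world-readable symlink into `/sys/devices/virtual/net/veth0`, and the assert task about to run only cares that `stat.exists` is true. As a rough local sketch of that presence check, here is a small Python helper mirroring a subset of the field names the stat module returned (this is an illustration with `os.lstat`, not Ansible's actual module code, which runs remotely inside AnsiballZ_stat.py):

```python
import os
import stat

def interface_stat(path):
    """Report a subset of the fields Ansible's stat module returns for a sysfs entry."""
    try:
        st = os.lstat(path)  # lstat: examine the symlink itself, not its target
    except FileNotFoundError:
        return {"exists": False}
    info = {
        "exists": True,
        "path": path,
        "islnk": stat.S_ISLNK(st.st_mode),
        "mode": format(stat.S_IMODE(st.st_mode), "04o"),
        "nlink": st.st_nlink,
    }
    if info["islnk"]:
        # /sys/class/net/<iface> is a kernel-provided link into /sys/devices/...
        info["lnk_target"] = os.readlink(path)
        info["lnk_source"] = os.path.realpath(path)
    return info
```

On a Linux host with the test veth pair up, `interface_stat("/sys/class/net/veth0")` would report `exists` and `islnk` as true, much like the module result logged above.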
22286 1726882799.64654: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 22286 1726882799.65040: in run() - task 0affe814-3a2d-a75d-4836-0000000002b6 22286 1726882799.65044: variable 'ansible_search_path' from source: unknown 22286 1726882799.65047: variable 'ansible_search_path' from source: unknown 22286 1726882799.65050: calling self._execute() 22286 1726882799.65052: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.65055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.65058: variable 'omit' from source: magic vars 22286 1726882799.66027: variable 'ansible_distribution_major_version' from source: facts 22286 1726882799.66155: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882799.66170: variable 'omit' from source: magic vars 22286 1726882799.66227: variable 'omit' from source: magic vars 22286 1726882799.66351: variable 'interface' from source: play vars 22286 1726882799.66377: variable 'omit' from source: magic vars 22286 1726882799.66425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882799.66485: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882799.66516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882799.66547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882799.66596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882799.66635: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882799.66644: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.66652: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.66791: Set connection var ansible_shell_executable to /bin/sh 22286 1726882799.66810: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882799.66817: Set connection var ansible_connection to ssh 22286 1726882799.66824: Set connection var ansible_shell_type to sh 22286 1726882799.66837: Set connection var ansible_timeout to 10 22286 1726882799.66852: Set connection var ansible_pipelining to False 22286 1726882799.66890: variable 'ansible_shell_executable' from source: unknown 22286 1726882799.66900: variable 'ansible_connection' from source: unknown 22286 1726882799.66913: variable 'ansible_module_compression' from source: unknown 22286 1726882799.66921: variable 'ansible_shell_type' from source: unknown 22286 1726882799.66928: variable 'ansible_shell_executable' from source: unknown 22286 1726882799.66938: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.67031: variable 'ansible_pipelining' from source: unknown 22286 1726882799.67037: variable 'ansible_timeout' from source: unknown 22286 1726882799.67039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.67171: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882799.67192: variable 'omit' from source: magic vars 22286 1726882799.67202: starting attempt loop 22286 1726882799.67213: running the handler 22286 1726882799.67400: variable 'interface_stat' from source: set_fact 22286 1726882799.67428: Evaluated conditional (interface_stat.stat.exists): True 22286 1726882799.67462: handler run complete 22286 1726882799.67489: attempt loop complete, returning result 22286 
1726882799.67498: _execute() done 22286 1726882799.67506: dumping result to json 22286 1726882799.67515: done dumping result, returning 22286 1726882799.67562: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0affe814-3a2d-a75d-4836-0000000002b6] 22286 1726882799.67566: sending task result for task 0affe814-3a2d-a75d-4836-0000000002b6 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 22286 1726882799.67703: no more pending results, returning what we have 22286 1726882799.67707: results queue empty 22286 1726882799.67708: checking for any_errors_fatal 22286 1726882799.67718: done checking for any_errors_fatal 22286 1726882799.67719: checking for max_fail_percentage 22286 1726882799.67722: done checking for max_fail_percentage 22286 1726882799.67723: checking to see if all hosts have failed and the running result is not ok 22286 1726882799.67724: done checking to see if all hosts have failed 22286 1726882799.67724: getting the remaining hosts for this loop 22286 1726882799.67726: done getting the remaining hosts for this loop 22286 1726882799.67731: getting the next task for host managed_node3 22286 1726882799.67742: done getting next task for host managed_node3 22286 1726882799.67746: ^ task is: TASK: Include the task 'assert_profile_present.yml' 22286 1726882799.67749: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882799.67754: getting variables 22286 1726882799.67756: in VariableManager get_vars() 22286 1726882799.67801: Calling all_inventory to load vars for managed_node3 22286 1726882799.67804: Calling groups_inventory to load vars for managed_node3 22286 1726882799.67807: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882799.67821: Calling all_plugins_play to load vars for managed_node3 22286 1726882799.67825: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882799.67829: Calling groups_plugins_play to load vars for managed_node3 22286 1726882799.68752: done sending task result for task 0affe814-3a2d-a75d-4836-0000000002b6 22286 1726882799.68755: WORKER PROCESS EXITING 22286 1726882799.71863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.76963: done with get_vars() 22286 1726882799.77003: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Friday 20 September 2024 21:39:59 -0400 (0:00:00.139) 0:00:23.164 ****** 22286 1726882799.77118: entering _queue_task() for managed_node3/include_tasks 22286 1726882799.77482: worker is 1 (out of 1 available) 22286 1726882799.77495: exiting _queue_task() for managed_node3/include_tasks 22286 1726882799.77509: done queuing things up, now waiting for results queue to drain 22286 1726882799.77510: waiting for pending results... 
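The assert action just completed by evaluating the condition `interface_stat.stat.exists` against the fact registered by the earlier stat task. A toy sketch of that evaluation pattern follows; note that real Ansible renders each condition through Jinja2 templating over the host's variables, whereas this stand-in uses `SimpleNamespace` attribute access and plain `eval` purely for illustration:

```python
from types import SimpleNamespace

def run_assert(that, facts):
    """Evaluate each condition; mirror the shape of the assert action's result dict."""
    failed = [cond for cond in that if not eval(cond, {}, facts)]  # illustration only
    if failed:
        return {"failed": True, "msg": "Assertion failed", "failed_conditions": failed}
    return {"changed": False, "msg": "All assertions passed"}

# Simplified stand-in for the fact registered by "Get stat for interface veth0"
facts = {"interface_stat": SimpleNamespace(stat=SimpleNamespace(exists=True))}
result = run_assert(["interface_stat.stat.exists"], facts)
```

With `exists` true, the result matches the `ok: [managed_node3]` / "All assertions passed" outcome in the log; a false fact would instead produce a failed task result.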
22286 1726882799.77803: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 22286 1726882799.77921: in run() - task 0affe814-3a2d-a75d-4836-00000000005d 22286 1726882799.77943: variable 'ansible_search_path' from source: unknown 22286 1726882799.77992: calling self._execute() 22286 1726882799.78141: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.78155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.78342: variable 'omit' from source: magic vars 22286 1726882799.78897: variable 'ansible_distribution_major_version' from source: facts 22286 1726882799.78916: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882799.78926: _execute() done 22286 1726882799.78937: dumping result to json 22286 1726882799.78948: done dumping result, returning 22286 1726882799.78958: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0affe814-3a2d-a75d-4836-00000000005d] 22286 1726882799.78969: sending task result for task 0affe814-3a2d-a75d-4836-00000000005d 22286 1726882799.79198: no more pending results, returning what we have 22286 1726882799.79205: in VariableManager get_vars() 22286 1726882799.79261: Calling all_inventory to load vars for managed_node3 22286 1726882799.79265: Calling groups_inventory to load vars for managed_node3 22286 1726882799.79268: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882799.79285: Calling all_plugins_play to load vars for managed_node3 22286 1726882799.79289: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882799.79293: Calling groups_plugins_play to load vars for managed_node3 22286 1726882799.80140: done sending task result for task 0affe814-3a2d-a75d-4836-00000000005d 22286 1726882799.80144: WORKER PROCESS EXITING 22286 1726882799.82328: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.85144: done with get_vars() 22286 1726882799.85193: variable 'ansible_search_path' from source: unknown 22286 1726882799.85209: we have included files to process 22286 1726882799.85211: generating all_blocks data 22286 1726882799.85213: done generating all_blocks data 22286 1726882799.85217: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 22286 1726882799.85219: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 22286 1726882799.85221: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 22286 1726882799.85463: in VariableManager get_vars() 22286 1726882799.85491: done with get_vars() 22286 1726882799.85813: done processing included file 22286 1726882799.85816: iterating over new_blocks loaded from include file 22286 1726882799.85817: in VariableManager get_vars() 22286 1726882799.85846: done with get_vars() 22286 1726882799.85848: filtering new block on tags 22286 1726882799.85873: done filtering new block on tags 22286 1726882799.85876: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 22286 1726882799.85882: extending task lists for all hosts with included blocks 22286 1726882799.89728: done extending task lists 22286 1726882799.89730: done processing included files 22286 1726882799.89730: results queue empty 22286 1726882799.89731: checking for any_errors_fatal 22286 1726882799.89735: done checking for any_errors_fatal 22286 1726882799.89737: checking for max_fail_percentage 22286 1726882799.89738: done 
checking for max_fail_percentage 22286 1726882799.89739: checking to see if all hosts have failed and the running result is not ok 22286 1726882799.89739: done checking to see if all hosts have failed 22286 1726882799.89740: getting the remaining hosts for this loop 22286 1726882799.89741: done getting the remaining hosts for this loop 22286 1726882799.89743: getting the next task for host managed_node3 22286 1726882799.89746: done getting next task for host managed_node3 22286 1726882799.89748: ^ task is: TASK: Include the task 'get_profile_stat.yml' 22286 1726882799.89750: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882799.89752: getting variables 22286 1726882799.89753: in VariableManager get_vars() 22286 1726882799.89765: Calling all_inventory to load vars for managed_node3 22286 1726882799.89767: Calling groups_inventory to load vars for managed_node3 22286 1726882799.89769: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882799.89774: Calling all_plugins_play to load vars for managed_node3 22286 1726882799.89778: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882799.89780: Calling groups_plugins_play to load vars for managed_node3 22286 1726882799.91206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.93001: done with get_vars() 22286 1726882799.93022: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:39:59 -0400 (0:00:00.159) 0:00:23.324 ****** 22286 1726882799.93085: entering _queue_task() for managed_node3/include_tasks 22286 1726882799.93356: worker is 1 (out of 1 available) 22286 1726882799.93370: exiting _queue_task() for managed_node3/include_tasks 22286 1726882799.93383: done queuing things up, now waiting for results queue to drain 22286 1726882799.93385: waiting for pending results... 
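The "extending task lists for all hosts with included blocks" steps above splice the blocks loaded from an included file into the host's pending task list, immediately after the include task itself, which is why `get_profile_stat.yml` tasks run next. A toy sketch of that splice (the strings are placeholders, not Ansible's real Task/Block objects):

```python
def extend_task_list(task_list, include_task, included_tasks):
    """Insert tasks loaded from an included file directly after the include task."""
    i = task_list.index(include_task)
    return task_list[: i + 1] + list(included_tasks) + task_list[i + 1 :]

# Hypothetical task names standing in for the tasks seen in this run
play_tasks = ["get_stat", "include get_profile_stat.yml", "assert_profile"]
extended = extend_task_list(
    play_tasks,
    "include get_profile_stat.yml",
    ["init_nm_flags", "get_nm_profile"],
)
```

After the splice, the next task fetched for the host is the first task of the included file, matching the log's jump to "Initialize NM profile exist and ansible_managed comment flag".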
22286 1726882799.93575: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 22286 1726882799.93651: in run() - task 0affe814-3a2d-a75d-4836-0000000003b8 22286 1726882799.93662: variable 'ansible_search_path' from source: unknown 22286 1726882799.93666: variable 'ansible_search_path' from source: unknown 22286 1726882799.93702: calling self._execute() 22286 1726882799.93787: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882799.93794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882799.93805: variable 'omit' from source: magic vars 22286 1726882799.94259: variable 'ansible_distribution_major_version' from source: facts 22286 1726882799.94262: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882799.94273: _execute() done 22286 1726882799.94283: dumping result to json 22286 1726882799.94369: done dumping result, returning 22286 1726882799.94372: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-a75d-4836-0000000003b8] 22286 1726882799.94375: sending task result for task 0affe814-3a2d-a75d-4836-0000000003b8 22286 1726882799.94448: done sending task result for task 0affe814-3a2d-a75d-4836-0000000003b8 22286 1726882799.94451: WORKER PROCESS EXITING 22286 1726882799.94504: no more pending results, returning what we have 22286 1726882799.94509: in VariableManager get_vars() 22286 1726882799.94569: Calling all_inventory to load vars for managed_node3 22286 1726882799.94573: Calling groups_inventory to load vars for managed_node3 22286 1726882799.94576: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882799.94593: Calling all_plugins_play to load vars for managed_node3 22286 1726882799.94596: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882799.94600: Calling groups_plugins_play to load vars for managed_node3 22286 
1726882799.96241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882799.98894: done with get_vars() 22286 1726882799.98925: variable 'ansible_search_path' from source: unknown 22286 1726882799.98926: variable 'ansible_search_path' from source: unknown 22286 1726882799.98977: we have included files to process 22286 1726882799.98978: generating all_blocks data 22286 1726882799.98980: done generating all_blocks data 22286 1726882799.98982: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22286 1726882799.98983: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22286 1726882799.98986: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22286 1726882800.00520: done processing included file 22286 1726882800.00523: iterating over new_blocks loaded from include file 22286 1726882800.00524: in VariableManager get_vars() 22286 1726882800.00550: done with get_vars() 22286 1726882800.00552: filtering new block on tags 22286 1726882800.00583: done filtering new block on tags 22286 1726882800.00586: in VariableManager get_vars() 22286 1726882800.00609: done with get_vars() 22286 1726882800.00612: filtering new block on tags 22286 1726882800.00719: done filtering new block on tags 22286 1726882800.00722: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 22286 1726882800.00729: extending task lists for all hosts with included blocks 22286 1726882800.01009: done extending task lists 22286 1726882800.01010: done processing included files 22286 1726882800.01011: results queue empty 22286 
1726882800.01012: checking for any_errors_fatal 22286 1726882800.01016: done checking for any_errors_fatal 22286 1726882800.01018: checking for max_fail_percentage 22286 1726882800.01020: done checking for max_fail_percentage 22286 1726882800.01021: checking to see if all hosts have failed and the running result is not ok 22286 1726882800.01022: done checking to see if all hosts have failed 22286 1726882800.01023: getting the remaining hosts for this loop 22286 1726882800.01024: done getting the remaining hosts for this loop 22286 1726882800.01028: getting the next task for host managed_node3 22286 1726882800.01033: done getting next task for host managed_node3 22286 1726882800.01041: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 22286 1726882800.01045: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882800.01048: getting variables 22286 1726882800.01049: in VariableManager get_vars() 22286 1726882800.01140: Calling all_inventory to load vars for managed_node3 22286 1726882800.01143: Calling groups_inventory to load vars for managed_node3 22286 1726882800.01146: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882800.01153: Calling all_plugins_play to load vars for managed_node3 22286 1726882800.01156: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882800.01160: Calling groups_plugins_play to load vars for managed_node3 22286 1726882800.03493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882800.07008: done with get_vars() 22286 1726882800.07045: done getting variables 22286 1726882800.07107: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:40:00 -0400 (0:00:00.140) 0:00:23.464 ****** 22286 1726882800.07146: entering _queue_task() for managed_node3/set_fact 22286 1726882800.07652: worker is 1 (out of 1 available) 22286 1726882800.07663: exiting _queue_task() for managed_node3/set_fact 22286 1726882800.07678: done queuing things up, now waiting for results queue to drain 22286 1726882800.07680: waiting for pending results... 
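Each debug entry above is prefixed with `<pid> <epoch-seconds>` (here pid 22286), while the TASK banner renders the same instant human-readably ("Friday 20 September 2024 21:40:00 -0400"). A minimal sketch of decoding that prefix, assuming it is a plain Unix timestamp and taking the -0400 offset from the banner (the offset is not encoded in the prefix itself):

```python
from datetime import datetime, timezone, timedelta

def decode_log_prefix(epoch: float, utc_offset_hours: int = -4) -> str:
    """Render an ansible debug-log epoch timestamp as wall-clock time.

    The -4h offset matches the "-0400" printed in the TASK banner; it is
    an assumption read off the log, not part of the prefix format.
    """
    tz = timezone(timedelta(hours=utc_offset_hours))
    return datetime.fromtimestamp(epoch, tz).strftime("%A %d %B %Y %H:%M:%S %z")

print(decode_log_prefix(1726882800.07146))
# -> Friday 20 September 2024 21:40:00 -0400
```

This matches the banner timestamp for the `entering _queue_task()` entry at 1726882800.07146.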
22286 1726882800.08254: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 22286 1726882800.08259: in run() - task 0affe814-3a2d-a75d-4836-0000000004b0 22286 1726882800.08265: variable 'ansible_search_path' from source: unknown 22286 1726882800.08269: variable 'ansible_search_path' from source: unknown 22286 1726882800.08272: calling self._execute() 22286 1726882800.08275: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.08278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.08281: variable 'omit' from source: magic vars 22286 1726882800.08739: variable 'ansible_distribution_major_version' from source: facts 22286 1726882800.08756: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882800.08763: variable 'omit' from source: magic vars 22286 1726882800.08823: variable 'omit' from source: magic vars 22286 1726882800.08877: variable 'omit' from source: magic vars 22286 1726882800.08925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882800.08989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882800.09014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882800.09044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882800.09056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882800.09104: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882800.09108: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.09113: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 22286 1726882800.09263: Set connection var ansible_shell_executable to /bin/sh 22286 1726882800.09273: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882800.09286: Set connection var ansible_connection to ssh 22286 1726882800.09289: Set connection var ansible_shell_type to sh 22286 1726882800.09306: Set connection var ansible_timeout to 10 22286 1726882800.09317: Set connection var ansible_pipelining to False 22286 1726882800.09348: variable 'ansible_shell_executable' from source: unknown 22286 1726882800.09351: variable 'ansible_connection' from source: unknown 22286 1726882800.09355: variable 'ansible_module_compression' from source: unknown 22286 1726882800.09357: variable 'ansible_shell_type' from source: unknown 22286 1726882800.09360: variable 'ansible_shell_executable' from source: unknown 22286 1726882800.09366: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.09371: variable 'ansible_pipelining' from source: unknown 22286 1726882800.09375: variable 'ansible_timeout' from source: unknown 22286 1726882800.09389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.09575: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882800.09738: variable 'omit' from source: magic vars 22286 1726882800.09741: starting attempt loop 22286 1726882800.09744: running the handler 22286 1726882800.09746: handler run complete 22286 1726882800.09748: attempt loop complete, returning result 22286 1726882800.09750: _execute() done 22286 1726882800.09752: dumping result to json 22286 1726882800.09753: done dumping result, returning 22286 1726882800.09755: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-a75d-4836-0000000004b0] 22286 1726882800.09757: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b0 22286 1726882800.09818: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b0 22286 1726882800.09822: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 22286 1726882800.09913: no more pending results, returning what we have 22286 1726882800.09918: results queue empty 22286 1726882800.09919: checking for any_errors_fatal 22286 1726882800.09921: done checking for any_errors_fatal 22286 1726882800.09922: checking for max_fail_percentage 22286 1726882800.09930: done checking for max_fail_percentage 22286 1726882800.09932: checking to see if all hosts have failed and the running result is not ok 22286 1726882800.09933: done checking to see if all hosts have failed 22286 1726882800.09934: getting the remaining hosts for this loop 22286 1726882800.09938: done getting the remaining hosts for this loop 22286 1726882800.09943: getting the next task for host managed_node3 22286 1726882800.09952: done getting next task for host managed_node3 22286 1726882800.09956: ^ task is: TASK: Stat profile file 22286 1726882800.09962: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882800.09968: getting variables 22286 1726882800.09970: in VariableManager get_vars() 22286 1726882800.10018: Calling all_inventory to load vars for managed_node3 22286 1726882800.10021: Calling groups_inventory to load vars for managed_node3 22286 1726882800.10025: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882800.10233: Calling all_plugins_play to load vars for managed_node3 22286 1726882800.10240: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882800.10249: Calling groups_plugins_play to load vars for managed_node3 22286 1726882800.12592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882800.15654: done with get_vars() 22286 1726882800.15692: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:40:00 -0400 (0:00:00.086) 0:00:23.551 ****** 22286 1726882800.15809: entering _queue_task() for managed_node3/stat 22286 1726882800.16153: worker is 1 (out of 1 available) 22286 1726882800.16167: exiting _queue_task() for managed_node3/stat 22286 1726882800.16185: done queuing things up, now waiting for results queue to drain 22286 1726882800.16186: waiting for pending results... 
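The `set_fact` result shown above registers three boolean flags as host facts; conceptually it is just a merge into the host's fact dictionary, which later tasks (like the profile stat below) can read. A minimal sketch of that merge, with illustrative dict names rather than Ansible's actual internal structures:

```python
def set_fact(host_facts: dict, new_facts: dict) -> dict:
    """Merge task-supplied facts into a copy of a host's fact store.

    Illustrative only: Ansible's VariableManager layers many variable
    sources with precedence rules; this shows just the final-merge idea.
    """
    merged = dict(host_facts)
    merged.update(new_facts)
    return merged

facts = set_fact(
    {"ansible_distribution_major_version": "9"},  # hypothetical pre-existing fact
    {
        "lsr_net_profile_ansible_managed": False,
        "lsr_net_profile_exists": False,
        "lsr_net_profile_fingerprint": False,
    },
)
```

The merged dict then carries the same three flags reported in the `ansible_facts` block of the task result.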
22286 1726882800.16496: running TaskExecutor() for managed_node3/TASK: Stat profile file 22286 1726882800.16626: in run() - task 0affe814-3a2d-a75d-4836-0000000004b1 22286 1726882800.16646: variable 'ansible_search_path' from source: unknown 22286 1726882800.16649: variable 'ansible_search_path' from source: unknown 22286 1726882800.16696: calling self._execute() 22286 1726882800.16940: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.16945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.16948: variable 'omit' from source: magic vars 22286 1726882800.17331: variable 'ansible_distribution_major_version' from source: facts 22286 1726882800.17347: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882800.17354: variable 'omit' from source: magic vars 22286 1726882800.17416: variable 'omit' from source: magic vars 22286 1726882800.17555: variable 'profile' from source: include params 22286 1726882800.17562: variable 'interface' from source: play vars 22286 1726882800.17657: variable 'interface' from source: play vars 22286 1726882800.17682: variable 'omit' from source: magic vars 22286 1726882800.17729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882800.17784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882800.17806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882800.17829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882800.17847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882800.17894: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 
1726882800.17898: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.17904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.18047: Set connection var ansible_shell_executable to /bin/sh 22286 1726882800.18057: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882800.18061: Set connection var ansible_connection to ssh 22286 1726882800.18139: Set connection var ansible_shell_type to sh 22286 1726882800.18142: Set connection var ansible_timeout to 10 22286 1726882800.18145: Set connection var ansible_pipelining to False 22286 1726882800.18148: variable 'ansible_shell_executable' from source: unknown 22286 1726882800.18150: variable 'ansible_connection' from source: unknown 22286 1726882800.18153: variable 'ansible_module_compression' from source: unknown 22286 1726882800.18155: variable 'ansible_shell_type' from source: unknown 22286 1726882800.18157: variable 'ansible_shell_executable' from source: unknown 22286 1726882800.18159: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.18161: variable 'ansible_pipelining' from source: unknown 22286 1726882800.18164: variable 'ansible_timeout' from source: unknown 22286 1726882800.18166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.18419: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882800.18431: variable 'omit' from source: magic vars 22286 1726882800.18439: starting attempt loop 22286 1726882800.18443: running the handler 22286 1726882800.18459: _low_level_execute_command(): starting 22286 1726882800.18470: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882800.19336: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882800.19539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882800.19545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882800.19548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.19613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.21402: stdout chunk (state=3): >>>/root <<< 22286 1726882800.21517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882800.21604: stderr chunk (state=3): >>><<< 22286 1726882800.21627: stdout chunk (state=3): >>><<< 22286 1726882800.21649: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882800.21754: _low_level_execute_command(): starting 22286 1726882800.21759: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319 `" && echo ansible-tmp-1726882800.2165625-23134-236547022492319="` echo /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319 `" ) && sleep 0' 22286 1726882800.22336: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882800.22353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882800.22370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882800.22493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882800.22544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.22673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.24817: stdout chunk (state=3): >>>ansible-tmp-1726882800.2165625-23134-236547022492319=/root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319 <<< 22286 1726882800.24949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882800.25033: stderr chunk (state=3): >>><<< 22286 1726882800.25038: stdout chunk (state=3): >>><<< 22286 1726882800.25239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882800.2165625-23134-236547022492319=/root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882800.25243: variable 'ansible_module_compression' from source: unknown 22286 1726882800.25245: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22286 1726882800.25248: variable 'ansible_facts' from source: unknown 22286 1726882800.25311: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py 22286 1726882800.25456: Sending initial data 22286 1726882800.25589: Sent initial data (153 bytes) 22286 1726882800.26142: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882800.26258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882800.26280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882800.26298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882800.26320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.26468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.28229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882800.28337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882800.28558: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpp1y7werd /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py <<< 22286 1726882800.28561: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py" <<< 22286 1726882800.28593: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpp1y7werd" to remote "/root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py" <<< 22286 1726882800.30896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882800.30950: stderr chunk (state=3): >>><<< 22286 1726882800.30965: stdout chunk (state=3): >>><<< 22286 1726882800.30998: done transferring module to remote 22286 1726882800.31015: _low_level_execute_command(): starting 22286 1726882800.31025: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/ /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py && sleep 0' 22286 1726882800.31633: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882800.31750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882800.31756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882800.31786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882800.31808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882800.31825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.31975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.34113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882800.34123: stdout chunk (state=3): >>><<< 22286 1726882800.34137: stderr chunk (state=3): >>><<< 22286 1726882800.34254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882800.34337: _low_level_execute_command(): starting 22286 1726882800.34345: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/AnsiballZ_stat.py && sleep 0' 22286 1726882800.35448: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882800.35660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882800.35670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.35818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.54329: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, 
"invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22286 1726882800.55843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882800.55847: stdout chunk (state=3): >>><<< 22286 1726882800.55850: stderr chunk (state=3): >>><<< 22286 1726882800.55853: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
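The module run above boils down to an existence check on the profile path, returning `{"changed": false, "stat": {"exists": false}}`. A minimal sketch of that core check, mirroring only the fields shown here (the real `ansible.builtin.stat` module returns far more detail, handles `follow`, checksums, etc.):

```python
import os

def stat_exists(path: str, follow: bool = False) -> dict:
    """Existence check shaped like the stat result above (sketch only)."""
    stat_fn = os.stat if follow else os.lstat  # follow=False: do not resolve symlinks
    try:
        stat_fn(path)
        exists = True
    except OSError:
        exists = False
    return {"changed": False, "stat": {"exists": exists}}

# On the test host, the ifcfg file for veth0 was absent, hence exists=false:
result = stat_exists("/etc/sysconfig/network-scripts/ifcfg-veth0")
```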
22286 1726882800.55856: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882800.55859: _low_level_execute_command(): starting 22286 1726882800.55861: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882800.2165625-23134-236547022492319/ > /dev/null 2>&1 && sleep 0' 22286 1726882800.57243: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882800.57449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.57693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.59766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882800.60036: stderr chunk (state=3): >>><<< 22286 1726882800.60042: stdout chunk (state=3): >>><<< 22286 1726882800.60046: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882800.60048: handler run complete 22286 1726882800.60051: attempt loop complete, returning result 22286 1726882800.60053: _execute() done 22286 1726882800.60055: dumping result to json 22286 
1726882800.60057: done dumping result, returning 22286 1726882800.60060: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affe814-3a2d-a75d-4836-0000000004b1] 22286 1726882800.60062: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b1 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 22286 1726882800.60510: no more pending results, returning what we have 22286 1726882800.60514: results queue empty 22286 1726882800.60520: checking for any_errors_fatal 22286 1726882800.60526: done checking for any_errors_fatal 22286 1726882800.60527: checking for max_fail_percentage 22286 1726882800.60530: done checking for max_fail_percentage 22286 1726882800.60531: checking to see if all hosts have failed and the running result is not ok 22286 1726882800.60532: done checking to see if all hosts have failed 22286 1726882800.60533: getting the remaining hosts for this loop 22286 1726882800.60537: done getting the remaining hosts for this loop 22286 1726882800.60542: getting the next task for host managed_node3 22286 1726882800.60549: done getting next task for host managed_node3 22286 1726882800.60553: ^ task is: TASK: Set NM profile exist flag based on the profile files 22286 1726882800.60557: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882800.60562: getting variables 22286 1726882800.60564: in VariableManager get_vars() 22286 1726882800.60614: Calling all_inventory to load vars for managed_node3 22286 1726882800.60617: Calling groups_inventory to load vars for managed_node3 22286 1726882800.60621: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882800.60633: Calling all_plugins_play to load vars for managed_node3 22286 1726882800.60842: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882800.60849: Calling groups_plugins_play to load vars for managed_node3 22286 1726882800.61940: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b1 22286 1726882800.61944: WORKER PROCESS EXITING 22286 1726882800.66291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882800.72655: done with get_vars() 22286 1726882800.72698: done getting variables 22286 1726882800.72772: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:40:00 -0400 (0:00:00.569) 0:00:24.121 ****** 22286 1726882800.72811: entering _queue_task() for managed_node3/set_fact 22286 1726882800.73992: worker is 1 (out of 1 available) 22286 1726882800.74005: exiting _queue_task() for managed_node3/set_fact 22286 1726882800.74021: done queuing things up, now waiting for results queue to drain 
22286 1726882800.74022: waiting for pending results... 22286 1726882800.74539: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 22286 1726882800.74901: in run() - task 0affe814-3a2d-a75d-4836-0000000004b2 22286 1726882800.74926: variable 'ansible_search_path' from source: unknown 22286 1726882800.75241: variable 'ansible_search_path' from source: unknown 22286 1726882800.75245: calling self._execute() 22286 1726882800.75501: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.75517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.75538: variable 'omit' from source: magic vars 22286 1726882800.76810: variable 'ansible_distribution_major_version' from source: facts 22286 1726882800.77439: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882800.77443: variable 'profile_stat' from source: set_fact 22286 1726882800.77446: Evaluated conditional (profile_stat.stat.exists): False 22286 1726882800.78039: when evaluation is False, skipping this task 22286 1726882800.78043: _execute() done 22286 1726882800.78046: dumping result to json 22286 1726882800.78048: done dumping result, returning 22286 1726882800.78051: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-a75d-4836-0000000004b2] 22286 1726882800.78053: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b2 22286 1726882800.78131: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b2 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22286 1726882800.78195: no more pending results, returning what we have 22286 1726882800.78202: results queue empty 22286 1726882800.78203: checking for any_errors_fatal 22286 1726882800.78217: done checking for any_errors_fatal 22286 
1726882800.78218: checking for max_fail_percentage 22286 1726882800.78221: done checking for max_fail_percentage 22286 1726882800.78223: checking to see if all hosts have failed and the running result is not ok 22286 1726882800.78224: done checking to see if all hosts have failed 22286 1726882800.78225: getting the remaining hosts for this loop 22286 1726882800.78227: done getting the remaining hosts for this loop 22286 1726882800.78232: getting the next task for host managed_node3 22286 1726882800.78243: done getting next task for host managed_node3 22286 1726882800.78246: ^ task is: TASK: Get NM profile info 22286 1726882800.78253: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882800.78260: getting variables 22286 1726882800.78262: in VariableManager get_vars() 22286 1726882800.78315: Calling all_inventory to load vars for managed_node3 22286 1726882800.78319: Calling groups_inventory to load vars for managed_node3 22286 1726882800.78322: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882800.78555: Calling all_plugins_play to load vars for managed_node3 22286 1726882800.78560: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882800.78565: Calling groups_plugins_play to load vars for managed_node3 22286 1726882800.79972: WORKER PROCESS EXITING 22286 1726882800.84196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882800.90154: done with get_vars() 22286 1726882800.90195: done getting variables 22286 1726882800.90474: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:40:00 -0400 (0:00:00.176) 0:00:24.298 ****** 22286 1726882800.90512: entering _queue_task() for managed_node3/shell 22286 1726882800.91087: worker is 1 (out of 1 available) 22286 1726882800.91101: exiting _queue_task() for managed_node3/shell 22286 1726882800.91115: done queuing things up, now waiting for results queue to drain 22286 1726882800.91116: waiting for pending results... 
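The "Set NM profile exist flag" task above was skipped because its conditional (`profile_stat.stat.exists`) evaluated to False, matching the `skip_reason` in the result. A shell sketch of that gate; the echoed messages are illustrative stand-ins, not Ansible's own strings, except for the skip reason quoted from the log:

```shell
# Mirror the conditional gate on the skipped set_fact task: only
# proceed when the earlier stat reported the profile file as existing.
exists=false   # stands in for profile_stat.stat.exists from the log

if [ "$exists" = "true" ]; then
  echo "setting NM profile exist flag"          # the set_fact would run
else
  echo "skipping: Conditional result was False" # matches the skip above
fi
```

Because the flag is never set, the play falls through to querying NetworkManager directly in the next task.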
22286 1726882800.91630: running TaskExecutor() for managed_node3/TASK: Get NM profile info 22286 1726882800.91965: in run() - task 0affe814-3a2d-a75d-4836-0000000004b3 22286 1726882800.91982: variable 'ansible_search_path' from source: unknown 22286 1726882800.91986: variable 'ansible_search_path' from source: unknown 22286 1726882800.92027: calling self._execute() 22286 1726882800.92540: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.92545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.92548: variable 'omit' from source: magic vars 22286 1726882800.93307: variable 'ansible_distribution_major_version' from source: facts 22286 1726882800.93321: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882800.93329: variable 'omit' from source: magic vars 22286 1726882800.93481: variable 'omit' from source: magic vars 22286 1726882800.93939: variable 'profile' from source: include params 22286 1726882800.93943: variable 'interface' from source: play vars 22286 1726882800.93945: variable 'interface' from source: play vars 22286 1726882800.94028: variable 'omit' from source: magic vars 22286 1726882800.94078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882800.94118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882800.94264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882800.94285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882800.94299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882800.94336: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 
1726882800.94459: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.94463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.94682: Set connection var ansible_shell_executable to /bin/sh 22286 1726882800.94694: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882800.94697: Set connection var ansible_connection to ssh 22286 1726882800.94699: Set connection var ansible_shell_type to sh 22286 1726882800.94708: Set connection var ansible_timeout to 10 22286 1726882800.94719: Set connection var ansible_pipelining to False 22286 1726882800.94792: variable 'ansible_shell_executable' from source: unknown 22286 1726882800.94796: variable 'ansible_connection' from source: unknown 22286 1726882800.94799: variable 'ansible_module_compression' from source: unknown 22286 1726882800.94804: variable 'ansible_shell_type' from source: unknown 22286 1726882800.94807: variable 'ansible_shell_executable' from source: unknown 22286 1726882800.94811: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882800.94817: variable 'ansible_pipelining' from source: unknown 22286 1726882800.94821: variable 'ansible_timeout' from source: unknown 22286 1726882800.94826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882800.95255: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882800.95272: variable 'omit' from source: magic vars 22286 1726882800.95280: starting attempt loop 22286 1726882800.95283: running the handler 22286 1726882800.95296: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882800.95318: _low_level_execute_command(): starting 22286 1726882800.95329: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882800.96947: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882800.97241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882800.97454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882800.97789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882800.99645: stdout chunk (state=3): >>>/root <<< 22286 1726882800.99750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882801.00040: stderr chunk (state=3): >>><<< 22286 1726882801.00043: stdout chunk (state=3): >>><<< 22286 1726882801.00045: _low_level_execute_command() done: rc=0, stdout=/root 
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882801.00048: _low_level_execute_command(): starting 22286 1726882801.00051: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194 `" && echo ansible-tmp-1726882800.9998198-23157-262815624628194="` echo /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194 `" ) && sleep 0' 22286 1726882801.01390: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882801.01563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882801.01567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882801.01589: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882801.01592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882801.01595: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882801.01599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882801.01753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882801.01895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882801.04200: stdout chunk (state=3): >>>ansible-tmp-1726882800.9998198-23157-262815624628194=/root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194 <<< 22286 1726882801.04208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882801.04314: stderr chunk (state=3): >>><<< 22286 1726882801.04318: stdout chunk (state=3): >>><<< 22286 1726882801.04341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882800.9998198-23157-262815624628194=/root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882801.04378: variable 'ansible_module_compression' from source: unknown 22286 1726882801.04432: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882801.04778: variable 'ansible_facts' from source: unknown 22286 1726882801.04859: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py 22286 1726882801.05113: Sending initial data 22286 1726882801.05117: Sent initial data (156 bytes) 22286 1726882801.06439: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882801.06443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882801.06446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882801.06448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882801.06451: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882801.06454: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882801.06456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882801.06463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882801.06465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882801.06824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882801.08331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882801.08565: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882801.08683: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmplceb7mu2 /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py <<< 22286 1726882801.08695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py" <<< 22286 1726882801.09077: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmplceb7mu2" to remote "/root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py" <<< 22286 1726882801.12257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882801.12787: stderr chunk (state=3): >>><<< 22286 1726882801.12791: stdout chunk (state=3): >>><<< 22286 1726882801.12793: done transferring module to remote 22286 1726882801.12796: _low_level_execute_command(): starting 22286 1726882801.12798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/ /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py && sleep 0' 22286 1726882801.13882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882801.13929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882801.13943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882801.13961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882801.13974: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882801.14125: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882801.14143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882801.14255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882801.14739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882801.16415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882801.16493: stderr chunk (state=3): >>><<< 22286 1726882801.16547: stdout chunk (state=3): >>><<< 22286 1726882801.16570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882801.16573: _low_level_execute_command(): starting 22286 1726882801.16581: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/AnsiballZ_command.py && sleep 0' 22286 1726882801.17849: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882801.17871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882801.17916: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882801.17980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882801.18086: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 22286 1726882801.18133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882801.18149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882801.18301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882801.37792: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:40:01.354727", "end": "2024-09-20 21:40:01.374558", "delta": "0:00:00.019831", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882801.39363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882801.39524: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882801.39527: stdout chunk (state=3): >>><<< 22286 1726882801.39540: stderr chunk (state=3): >>><<< 22286 1726882801.39563: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:40:01.354727", "end": "2024-09-20 21:40:01.374558", "delta": "0:00:00.019831", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882801.39614: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882801.39623: _low_level_execute_command(): starting 22286 1726882801.39632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882800.9998198-23157-262815624628194/ > /dev/null 2>&1 && sleep 0' 22286 1726882801.41183: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882801.41255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882801.41393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882801.41397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882801.41540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882801.43593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882801.43670: stderr chunk (state=3): >>><<< 22286 1726882801.43941: stdout chunk (state=3): >>><<< 22286 1726882801.43945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882801.43947: handler run complete 22286 1726882801.43950: Evaluated 
conditional (False): False 22286 1726882801.43952: attempt loop complete, returning result 22286 1726882801.43954: _execute() done 22286 1726882801.43956: dumping result to json 22286 1726882801.43958: done dumping result, returning 22286 1726882801.43960: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affe814-3a2d-a75d-4836-0000000004b3] 22286 1726882801.43962: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b3 22286 1726882801.44208: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b3 22286 1726882801.44211: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.019831", "end": "2024-09-20 21:40:01.374558", "rc": 0, "start": "2024-09-20 21:40:01.354727" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 22286 1726882801.44313: no more pending results, returning what we have 22286 1726882801.44317: results queue empty 22286 1726882801.44318: checking for any_errors_fatal 22286 1726882801.44325: done checking for any_errors_fatal 22286 1726882801.44326: checking for max_fail_percentage 22286 1726882801.44329: done checking for max_fail_percentage 22286 1726882801.44330: checking to see if all hosts have failed and the running result is not ok 22286 1726882801.44331: done checking to see if all hosts have failed 22286 1726882801.44332: getting the remaining hosts for this loop 22286 1726882801.44338: done getting the remaining hosts for this loop 22286 1726882801.44343: getting the next task for host managed_node3 22286 1726882801.44352: done getting next task for host managed_node3 22286 1726882801.44356: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22286 1726882801.44361: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882801.44366: getting variables 22286 1726882801.44368: in VariableManager get_vars() 22286 1726882801.44416: Calling all_inventory to load vars for managed_node3 22286 1726882801.44419: Calling groups_inventory to load vars for managed_node3 22286 1726882801.44423: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882801.44840: Calling all_plugins_play to load vars for managed_node3 22286 1726882801.44845: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882801.44851: Calling groups_plugins_play to load vars for managed_node3 22286 1726882801.49361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882801.56718: done with get_vars() 22286 1726882801.56963: done getting variables 22286 1726882801.57032: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:40:01 -0400 (0:00:00.665) 0:00:24.964 ****** 22286 1726882801.57070: entering _queue_task() for managed_node3/set_fact 22286 1726882801.58220: worker is 1 (out of 1 available) 22286 1726882801.58237: exiting _queue_task() for managed_node3/set_fact 22286 1726882801.58254: done queuing things up, now waiting for results queue to drain 22286 1726882801.58255: waiting for pending results... 22286 1726882801.59023: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22286 1726882801.59741: in run() - task 0affe814-3a2d-a75d-4836-0000000004b4 22286 1726882801.59746: variable 'ansible_search_path' from source: unknown 22286 1726882801.59750: variable 'ansible_search_path' from source: unknown 22286 1726882801.59754: calling self._execute() 22286 1726882801.60339: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882801.60343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882801.60346: variable 'omit' from source: magic vars 22286 1726882801.61123: variable 'ansible_distribution_major_version' from source: facts 22286 1726882801.61145: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882801.61459: variable 'nm_profile_exists' from source: set_fact 22286 1726882801.61617: Evaluated conditional (nm_profile_exists.rc == 0): True 22286 1726882801.61632: variable 'omit' from source: magic vars 22286 1726882801.61702: variable 'omit' from source: magic vars 22286 1726882801.61884: variable 'omit' from source: magic vars 22286 1726882801.61936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882801.61991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 22286 1726882801.62169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882801.62183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882801.62203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882801.62246: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882801.62287: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882801.62348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882801.62549: Set connection var ansible_shell_executable to /bin/sh 22286 1726882801.62840: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882801.62843: Set connection var ansible_connection to ssh 22286 1726882801.62845: Set connection var ansible_shell_type to sh 22286 1726882801.62848: Set connection var ansible_timeout to 10 22286 1726882801.62850: Set connection var ansible_pipelining to False 22286 1726882801.62852: variable 'ansible_shell_executable' from source: unknown 22286 1726882801.62854: variable 'ansible_connection' from source: unknown 22286 1726882801.62856: variable 'ansible_module_compression' from source: unknown 22286 1726882801.62858: variable 'ansible_shell_type' from source: unknown 22286 1726882801.62860: variable 'ansible_shell_executable' from source: unknown 22286 1726882801.62862: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882801.62864: variable 'ansible_pipelining' from source: unknown 22286 1726882801.62866: variable 'ansible_timeout' from source: unknown 22286 1726882801.62868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882801.63312: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882801.63316: variable 'omit' from source: magic vars 22286 1726882801.63318: starting attempt loop 22286 1726882801.63321: running the handler 22286 1726882801.63323: handler run complete 22286 1726882801.63325: attempt loop complete, returning result 22286 1726882801.63327: _execute() done 22286 1726882801.63329: dumping result to json 22286 1726882801.63331: done dumping result, returning 22286 1726882801.63440: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-a75d-4836-0000000004b4] 22286 1726882801.63444: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b4 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 22286 1726882801.63626: no more pending results, returning what we have 22286 1726882801.63630: results queue empty 22286 1726882801.63631: checking for any_errors_fatal 22286 1726882801.63643: done checking for any_errors_fatal 22286 1726882801.63644: checking for max_fail_percentage 22286 1726882801.63647: done checking for max_fail_percentage 22286 1726882801.63648: checking to see if all hosts have failed and the running result is not ok 22286 1726882801.63649: done checking to see if all hosts have failed 22286 1726882801.63650: getting the remaining hosts for this loop 22286 1726882801.63652: done getting the remaining hosts for this loop 22286 1726882801.63658: getting the next task for host managed_node3 22286 1726882801.63670: done getting next task for host managed_node3 22286 1726882801.63673: ^ task is: TASK: Get the ansible_managed 
comment in ifcfg-{{ profile }} 22286 1726882801.63679: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882801.63683: getting variables 22286 1726882801.63686: in VariableManager get_vars() 22286 1726882801.63732: Calling all_inventory to load vars for managed_node3 22286 1726882801.64238: Calling groups_inventory to load vars for managed_node3 22286 1726882801.64243: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882801.64250: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b4 22286 1726882801.64254: WORKER PROCESS EXITING 22286 1726882801.64298: Calling all_plugins_play to load vars for managed_node3 22286 1726882801.64303: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882801.64307: Calling groups_plugins_play to load vars for managed_node3 22286 1726882801.67116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882801.71231: done with get_vars() 22286 1726882801.71681: done getting variables 22286 1726882801.71755: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882801.72305: variable 'profile' from source: include params 22286 1726882801.72310: variable 'interface' from source: play vars 22286 1726882801.72394: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:40:01 -0400 (0:00:00.153) 0:00:25.117 ****** 22286 1726882801.72940: entering _queue_task() for managed_node3/command 22286 1726882801.73915: worker is 1 (out of 1 available) 22286 1726882801.73929: exiting _queue_task() for managed_node3/command 22286 1726882801.74144: done queuing things up, now waiting for results queue to drain 22286 1726882801.74147: waiting for pending results... 
22286 1726882801.74471: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 22286 1726882801.74921: in run() - task 0affe814-3a2d-a75d-4836-0000000004b6 22286 1726882801.75179: variable 'ansible_search_path' from source: unknown 22286 1726882801.75183: variable 'ansible_search_path' from source: unknown 22286 1726882801.75186: calling self._execute() 22286 1726882801.75310: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882801.75366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882801.75574: variable 'omit' from source: magic vars 22286 1726882801.76401: variable 'ansible_distribution_major_version' from source: facts 22286 1726882801.76472: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882801.76821: variable 'profile_stat' from source: set_fact 22286 1726882801.76847: Evaluated conditional (profile_stat.stat.exists): False 22286 1726882801.76902: when evaluation is False, skipping this task 22286 1726882801.76910: _execute() done 22286 1726882801.76918: dumping result to json 22286 1726882801.76925: done dumping result, returning 22286 1726882801.77113: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0affe814-3a2d-a75d-4836-0000000004b6] 22286 1726882801.77117: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b6 22286 1726882801.77222: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b6 22286 1726882801.77227: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22286 1726882801.77293: no more pending results, returning what we have 22286 1726882801.77298: results queue empty 22286 1726882801.77299: checking for any_errors_fatal 22286 1726882801.77306: done checking for any_errors_fatal 22286 1726882801.77307: checking 
for max_fail_percentage 22286 1726882801.77310: done checking for max_fail_percentage 22286 1726882801.77311: checking to see if all hosts have failed and the running result is not ok 22286 1726882801.77312: done checking to see if all hosts have failed 22286 1726882801.77313: getting the remaining hosts for this loop 22286 1726882801.77315: done getting the remaining hosts for this loop 22286 1726882801.77320: getting the next task for host managed_node3 22286 1726882801.77329: done getting next task for host managed_node3 22286 1726882801.77332: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 22286 1726882801.77441: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882801.77448: getting variables 22286 1726882801.77450: in VariableManager get_vars() 22286 1726882801.77498: Calling all_inventory to load vars for managed_node3 22286 1726882801.77502: Calling groups_inventory to load vars for managed_node3 22286 1726882801.77505: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882801.77518: Calling all_plugins_play to load vars for managed_node3 22286 1726882801.77523: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882801.77527: Calling groups_plugins_play to load vars for managed_node3 22286 1726882801.82644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882801.90310: done with get_vars() 22286 1726882801.90362: done getting variables 22286 1726882801.90669: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882801.91026: variable 'profile' from source: include params 22286 1726882801.91031: variable 'interface' from source: play vars 22286 1726882801.91390: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:40:01 -0400 (0:00:00.189) 0:00:25.307 ****** 22286 1726882801.91443: entering _queue_task() for managed_node3/set_fact 22286 1726882801.92503: worker is 1 (out of 1 available) 22286 1726882801.92517: exiting _queue_task() for managed_node3/set_fact 22286 1726882801.92531: done queuing things up, now waiting for results queue to drain 22286 1726882801.92533: waiting for pending results... 
22286 1726882801.93157: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 22286 1726882801.93673: in run() - task 0affe814-3a2d-a75d-4836-0000000004b7 22286 1726882801.93691: variable 'ansible_search_path' from source: unknown 22286 1726882801.93696: variable 'ansible_search_path' from source: unknown 22286 1726882801.93739: calling self._execute() 22286 1726882801.94254: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882801.94263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882801.94268: variable 'omit' from source: magic vars 22286 1726882801.95092: variable 'ansible_distribution_major_version' from source: facts 22286 1726882801.95154: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882801.95460: variable 'profile_stat' from source: set_fact 22286 1726882801.95464: Evaluated conditional (profile_stat.stat.exists): False 22286 1726882801.95466: when evaluation is False, skipping this task 22286 1726882801.95469: _execute() done 22286 1726882801.95472: dumping result to json 22286 1726882801.95474: done dumping result, returning 22286 1726882801.95479: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0affe814-3a2d-a75d-4836-0000000004b7] 22286 1726882801.95482: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b7 22286 1726882801.95727: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b7 22286 1726882801.95731: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22286 1726882801.95791: no more pending results, returning what we have 22286 1726882801.95796: results queue empty 22286 1726882801.95797: checking for any_errors_fatal 22286 1726882801.95804: done checking for any_errors_fatal 22286 1726882801.95805: 
checking for max_fail_percentage 22286 1726882801.95808: done checking for max_fail_percentage 22286 1726882801.95809: checking to see if all hosts have failed and the running result is not ok 22286 1726882801.95810: done checking to see if all hosts have failed 22286 1726882801.95811: getting the remaining hosts for this loop 22286 1726882801.95813: done getting the remaining hosts for this loop 22286 1726882801.95818: getting the next task for host managed_node3 22286 1726882801.95826: done getting next task for host managed_node3 22286 1726882801.95830: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 22286 1726882801.95838: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882801.95843: getting variables 22286 1726882801.95845: in VariableManager get_vars() 22286 1726882801.95895: Calling all_inventory to load vars for managed_node3 22286 1726882801.95898: Calling groups_inventory to load vars for managed_node3 22286 1726882801.95900: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882801.95912: Calling all_plugins_play to load vars for managed_node3 22286 1726882801.95916: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882801.95919: Calling groups_plugins_play to load vars for managed_node3 22286 1726882802.13587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882802.20227: done with get_vars() 22286 1726882802.20274: done getting variables 22286 1726882802.20540: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882802.20664: variable 'profile' from source: include params 22286 1726882802.20668: variable 'interface' from source: play vars 22286 1726882802.20946: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:40:02 -0400 (0:00:00.295) 0:00:25.603 ****** 22286 1726882802.20980: entering _queue_task() for managed_node3/command 22286 1726882802.21774: worker is 1 (out of 1 available) 22286 1726882802.21785: exiting _queue_task() for managed_node3/command 22286 1726882802.21796: done queuing things up, now waiting for results queue to drain 22286 1726882802.21799: waiting for pending results... 
22286 1726882802.22179: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 22286 1726882802.22741: in run() - task 0affe814-3a2d-a75d-4836-0000000004b8 22286 1726882802.22746: variable 'ansible_search_path' from source: unknown 22286 1726882802.22749: variable 'ansible_search_path' from source: unknown 22286 1726882802.22767: calling self._execute() 22286 1726882802.23028: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.23063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.23266: variable 'omit' from source: magic vars 22286 1726882802.24239: variable 'ansible_distribution_major_version' from source: facts 22286 1726882802.24243: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882802.24549: variable 'profile_stat' from source: set_fact 22286 1726882802.24572: Evaluated conditional (profile_stat.stat.exists): False 22286 1726882802.24593: when evaluation is False, skipping this task 22286 1726882802.24604: _execute() done 22286 1726882802.24739: dumping result to json 22286 1726882802.24743: done dumping result, returning 22286 1726882802.24747: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0affe814-3a2d-a75d-4836-0000000004b8] 22286 1726882802.24750: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b8 22286 1726882802.24886: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b8 22286 1726882802.24891: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22286 1726882802.24954: no more pending results, returning what we have 22286 1726882802.24960: results queue empty 22286 1726882802.24961: checking for any_errors_fatal 22286 1726882802.24971: done checking for any_errors_fatal 22286 1726882802.24972: checking for 
max_fail_percentage 22286 1726882802.24975: done checking for max_fail_percentage 22286 1726882802.24976: checking to see if all hosts have failed and the running result is not ok 22286 1726882802.24977: done checking to see if all hosts have failed 22286 1726882802.24978: getting the remaining hosts for this loop 22286 1726882802.24980: done getting the remaining hosts for this loop 22286 1726882802.24985: getting the next task for host managed_node3 22286 1726882802.24994: done getting next task for host managed_node3 22286 1726882802.24997: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 22286 1726882802.25003: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882802.25009: getting variables 22286 1726882802.25010: in VariableManager get_vars() 22286 1726882802.25060: Calling all_inventory to load vars for managed_node3 22286 1726882802.25063: Calling groups_inventory to load vars for managed_node3 22286 1726882802.25066: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882802.25080: Calling all_plugins_play to load vars for managed_node3 22286 1726882802.25083: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882802.25086: Calling groups_plugins_play to load vars for managed_node3 22286 1726882802.28822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882802.32373: done with get_vars() 22286 1726882802.32408: done getting variables 22286 1726882802.32486: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882802.32618: variable 'profile' from source: include params 22286 1726882802.32622: variable 'interface' from source: play vars 22286 1726882802.32696: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:40:02 -0400 (0:00:00.117) 0:00:25.720 ****** 22286 1726882802.32731: entering _queue_task() for managed_node3/set_fact 22286 1726882802.33078: worker is 1 (out of 1 available) 22286 1726882802.33092: exiting _queue_task() for managed_node3/set_fact 22286 1726882802.33107: done queuing things up, now waiting for results queue to drain 22286 1726882802.33109: waiting for pending results... 
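The two skips logged above follow Ansible's standard conditional flow: each task's `when` clauses are evaluated in order, and the first one that evaluates False short-circuits execution ("when evaluation is False, skipping this task"), producing the `skipping: [...] => {"false_condition": ...}` result. A task of roughly this shape would produce those log lines — this is a hypothetical reconstruction, not the actual contents of `get_profile_stat.yml`, and the module arguments are placeholders; only the `when` conditions are taken from the log:

```yaml
# Hypothetical sketch — the real task body in get_profile_stat.yml is not
# shown in this log, so name and arguments here are illustrative only.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: cat /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # placeholder
  when:
    - ansible_distribution_major_version != '6'  # log: evaluated True
    - profile_stat.stat.exists                   # log: evaluated False -> skipped
```

Note that both conditionals are logged individually ("Evaluated conditional (...)"), but only the failing one is reported back as `false_condition` in the skip result.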
22286 1726882802.33641: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 22286 1726882802.33686: in run() - task 0affe814-3a2d-a75d-4836-0000000004b9 22286 1726882802.33706: variable 'ansible_search_path' from source: unknown 22286 1726882802.33740: variable 'ansible_search_path' from source: unknown 22286 1726882802.33752: calling self._execute() 22286 1726882802.33943: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.33948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.33952: variable 'omit' from source: magic vars 22286 1726882802.34327: variable 'ansible_distribution_major_version' from source: facts 22286 1726882802.34341: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882802.34500: variable 'profile_stat' from source: set_fact 22286 1726882802.34515: Evaluated conditional (profile_stat.stat.exists): False 22286 1726882802.34519: when evaluation is False, skipping this task 22286 1726882802.34523: _execute() done 22286 1726882802.34525: dumping result to json 22286 1726882802.34532: done dumping result, returning 22286 1726882802.34541: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0affe814-3a2d-a75d-4836-0000000004b9] 22286 1726882802.34588: sending task result for task 0affe814-3a2d-a75d-4836-0000000004b9 22286 1726882802.34671: done sending task result for task 0affe814-3a2d-a75d-4836-0000000004b9 22286 1726882802.34677: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22286 1726882802.34736: no more pending results, returning what we have 22286 1726882802.34741: results queue empty 22286 1726882802.34743: checking for any_errors_fatal 22286 1726882802.34749: done checking for any_errors_fatal 22286 1726882802.34750: checking for 
max_fail_percentage 22286 1726882802.34753: done checking for max_fail_percentage 22286 1726882802.34754: checking to see if all hosts have failed and the running result is not ok 22286 1726882802.34755: done checking to see if all hosts have failed 22286 1726882802.34757: getting the remaining hosts for this loop 22286 1726882802.34759: done getting the remaining hosts for this loop 22286 1726882802.34764: getting the next task for host managed_node3 22286 1726882802.34773: done getting next task for host managed_node3 22286 1726882802.34777: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 22286 1726882802.34782: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882802.34787: getting variables 22286 1726882802.34788: in VariableManager get_vars() 22286 1726882802.35037: Calling all_inventory to load vars for managed_node3 22286 1726882802.35041: Calling groups_inventory to load vars for managed_node3 22286 1726882802.35045: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882802.35056: Calling all_plugins_play to load vars for managed_node3 22286 1726882802.35060: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882802.35064: Calling groups_plugins_play to load vars for managed_node3 22286 1726882802.37202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882802.40906: done with get_vars() 22286 1726882802.40949: done getting variables 22286 1726882802.41026: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882802.41186: variable 'profile' from source: include params 22286 1726882802.41190: variable 'interface' from source: play vars 22286 1726882802.41265: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:40:02 -0400 (0:00:00.085) 0:00:25.806 ****** 22286 1726882802.41301: entering _queue_task() for managed_node3/assert 22286 1726882802.41652: worker is 1 (out of 1 available) 22286 1726882802.41666: exiting _queue_task() for managed_node3/assert 22286 1726882802.41679: done queuing things up, now waiting for results queue to drain 22286 1726882802.41681: waiting for pending results... 
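The assert task queued above ("Assert that the profile is present - 'veth0'") shows the other half of the pattern: instead of skipping, the handler evaluates its assertion expression ("Evaluated conditional (lsr_net_profile_exists): True") and returns `ok` with "All assertions passed". A minimal sketch of such a task, assuming the shape of `assert_profile_present.yml` from the logged conditional — the actual file may list additional assertions or a custom failure message:

```yaml
# Hypothetical sketch of the queued assert task; only the asserted
# expression is grounded in the log output.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists  # log: evaluated True -> "All assertions passed"
```

The later tasks in this log ("ansible managed comment", "fingerprint comment") follow the identical flow with `lsr_net_profile_ansible_managed` and `lsr_net_profile_fingerprint` respectively.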
22286 1726882802.42057: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 22286 1726882802.42199: in run() - task 0affe814-3a2d-a75d-4836-0000000003b9 22286 1726882802.42211: variable 'ansible_search_path' from source: unknown 22286 1726882802.42215: variable 'ansible_search_path' from source: unknown 22286 1726882802.42218: calling self._execute() 22286 1726882802.42279: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.42290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.42303: variable 'omit' from source: magic vars 22286 1726882802.42796: variable 'ansible_distribution_major_version' from source: facts 22286 1726882802.42816: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882802.42879: variable 'omit' from source: magic vars 22286 1726882802.43041: variable 'omit' from source: magic vars 22286 1726882802.43346: variable 'profile' from source: include params 22286 1726882802.43350: variable 'interface' from source: play vars 22286 1726882802.43352: variable 'interface' from source: play vars 22286 1726882802.43355: variable 'omit' from source: magic vars 22286 1726882802.43357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882802.43360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882802.43363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882802.43365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.43373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.43408: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 22286 1726882802.43411: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.43416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.43672: Set connection var ansible_shell_executable to /bin/sh 22286 1726882802.43677: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882802.43680: Set connection var ansible_connection to ssh 22286 1726882802.43683: Set connection var ansible_shell_type to sh 22286 1726882802.43686: Set connection var ansible_timeout to 10 22286 1726882802.43688: Set connection var ansible_pipelining to False 22286 1726882802.43691: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.43694: variable 'ansible_connection' from source: unknown 22286 1726882802.43696: variable 'ansible_module_compression' from source: unknown 22286 1726882802.43698: variable 'ansible_shell_type' from source: unknown 22286 1726882802.43701: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.43703: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.43705: variable 'ansible_pipelining' from source: unknown 22286 1726882802.43708: variable 'ansible_timeout' from source: unknown 22286 1726882802.43710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.43828: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882802.43843: variable 'omit' from source: magic vars 22286 1726882802.43849: starting attempt loop 22286 1726882802.43853: running the handler 22286 1726882802.43996: variable 'lsr_net_profile_exists' from source: set_fact 22286 1726882802.44003: Evaluated conditional 
(lsr_net_profile_exists): True 22286 1726882802.44011: handler run complete 22286 1726882802.44031: attempt loop complete, returning result 22286 1726882802.44036: _execute() done 22286 1726882802.44039: dumping result to json 22286 1726882802.44045: done dumping result, returning 22286 1726882802.44053: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [0affe814-3a2d-a75d-4836-0000000003b9] 22286 1726882802.44060: sending task result for task 0affe814-3a2d-a75d-4836-0000000003b9 22286 1726882802.44181: done sending task result for task 0affe814-3a2d-a75d-4836-0000000003b9 22286 1726882802.44185: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 22286 1726882802.44250: no more pending results, returning what we have 22286 1726882802.44254: results queue empty 22286 1726882802.44255: checking for any_errors_fatal 22286 1726882802.44263: done checking for any_errors_fatal 22286 1726882802.44264: checking for max_fail_percentage 22286 1726882802.44267: done checking for max_fail_percentage 22286 1726882802.44274: checking to see if all hosts have failed and the running result is not ok 22286 1726882802.44277: done checking to see if all hosts have failed 22286 1726882802.44278: getting the remaining hosts for this loop 22286 1726882802.44281: done getting the remaining hosts for this loop 22286 1726882802.44286: getting the next task for host managed_node3 22286 1726882802.44294: done getting next task for host managed_node3 22286 1726882802.44297: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 22286 1726882802.44301: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882802.44306: getting variables 22286 1726882802.44308: in VariableManager get_vars() 22286 1726882802.44356: Calling all_inventory to load vars for managed_node3 22286 1726882802.44359: Calling groups_inventory to load vars for managed_node3 22286 1726882802.44363: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882802.44378: Calling all_plugins_play to load vars for managed_node3 22286 1726882802.44382: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882802.44386: Calling groups_plugins_play to load vars for managed_node3 22286 1726882802.46992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882802.50593: done with get_vars() 22286 1726882802.50632: done getting variables 22286 1726882802.50703: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882802.50847: variable 'profile' from source: include params 22286 1726882802.50851: variable 'interface' from source: play vars 22286 1726882802.50925: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:40:02 -0400 
(0:00:00.096) 0:00:25.903 ****** 22286 1726882802.50972: entering _queue_task() for managed_node3/assert 22286 1726882802.51459: worker is 1 (out of 1 available) 22286 1726882802.51504: exiting _queue_task() for managed_node3/assert 22286 1726882802.51517: done queuing things up, now waiting for results queue to drain 22286 1726882802.51519: waiting for pending results... 22286 1726882802.51719: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 22286 1726882802.51886: in run() - task 0affe814-3a2d-a75d-4836-0000000003ba 22286 1726882802.51890: variable 'ansible_search_path' from source: unknown 22286 1726882802.51893: variable 'ansible_search_path' from source: unknown 22286 1726882802.51895: calling self._execute() 22286 1726882802.52148: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.52153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.52156: variable 'omit' from source: magic vars 22286 1726882802.52496: variable 'ansible_distribution_major_version' from source: facts 22286 1726882802.52516: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882802.52523: variable 'omit' from source: magic vars 22286 1726882802.52740: variable 'omit' from source: magic vars 22286 1726882802.52745: variable 'profile' from source: include params 22286 1726882802.52748: variable 'interface' from source: play vars 22286 1726882802.52803: variable 'interface' from source: play vars 22286 1726882802.52825: variable 'omit' from source: magic vars 22286 1726882802.52878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882802.52929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882802.52957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 
1726882802.52978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.53000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.53035: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882802.53039: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.53044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.53252: Set connection var ansible_shell_executable to /bin/sh 22286 1726882802.53256: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882802.53259: Set connection var ansible_connection to ssh 22286 1726882802.53261: Set connection var ansible_shell_type to sh 22286 1726882802.53276: Set connection var ansible_timeout to 10 22286 1726882802.53280: Set connection var ansible_pipelining to False 22286 1726882802.53284: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.53288: variable 'ansible_connection' from source: unknown 22286 1726882802.53291: variable 'ansible_module_compression' from source: unknown 22286 1726882802.53294: variable 'ansible_shell_type' from source: unknown 22286 1726882802.53297: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.53300: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.53303: variable 'ansible_pipelining' from source: unknown 22286 1726882802.53307: variable 'ansible_timeout' from source: unknown 22286 1726882802.53323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.53516: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882802.53529: variable 'omit' from source: magic vars 22286 1726882802.53541: starting attempt loop 22286 1726882802.53544: running the handler 22286 1726882802.53696: variable 'lsr_net_profile_ansible_managed' from source: set_fact 22286 1726882802.53702: Evaluated conditional (lsr_net_profile_ansible_managed): True 22286 1726882802.53710: handler run complete 22286 1726882802.53728: attempt loop complete, returning result 22286 1726882802.53731: _execute() done 22286 1726882802.53737: dumping result to json 22286 1726882802.53742: done dumping result, returning 22286 1726882802.53751: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0affe814-3a2d-a75d-4836-0000000003ba] 22286 1726882802.53763: sending task result for task 0affe814-3a2d-a75d-4836-0000000003ba 22286 1726882802.53962: done sending task result for task 0affe814-3a2d-a75d-4836-0000000003ba 22286 1726882802.53965: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 22286 1726882802.54026: no more pending results, returning what we have 22286 1726882802.54030: results queue empty 22286 1726882802.54031: checking for any_errors_fatal 22286 1726882802.54038: done checking for any_errors_fatal 22286 1726882802.54040: checking for max_fail_percentage 22286 1726882802.54042: done checking for max_fail_percentage 22286 1726882802.54043: checking to see if all hosts have failed and the running result is not ok 22286 1726882802.54044: done checking to see if all hosts have failed 22286 1726882802.54045: getting the remaining hosts for this loop 22286 1726882802.54047: done getting the remaining hosts for this loop 22286 1726882802.54052: getting the next task for host managed_node3 22286 1726882802.54059: done getting 
next task for host managed_node3 22286 1726882802.54062: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 22286 1726882802.54065: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882802.54070: getting variables 22286 1726882802.54071: in VariableManager get_vars() 22286 1726882802.54233: Calling all_inventory to load vars for managed_node3 22286 1726882802.54238: Calling groups_inventory to load vars for managed_node3 22286 1726882802.54242: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882802.54253: Calling all_plugins_play to load vars for managed_node3 22286 1726882802.54256: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882802.54261: Calling groups_plugins_play to load vars for managed_node3 22286 1726882802.56565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882802.59790: done with get_vars() 22286 1726882802.59826: done getting variables 22286 1726882802.59914: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882802.60061: variable 'profile' from source: include params 22286 1726882802.60066: variable 'interface' from 
source: play vars 22286 1726882802.60164: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:40:02 -0400 (0:00:00.092) 0:00:25.995 ****** 22286 1726882802.60229: entering _queue_task() for managed_node3/assert 22286 1726882802.60761: worker is 1 (out of 1 available) 22286 1726882802.60775: exiting _queue_task() for managed_node3/assert 22286 1726882802.60787: done queuing things up, now waiting for results queue to drain 22286 1726882802.60789: waiting for pending results... 22286 1726882802.60977: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 22286 1726882802.61088: in run() - task 0affe814-3a2d-a75d-4836-0000000003bb 22286 1726882802.61093: variable 'ansible_search_path' from source: unknown 22286 1726882802.61096: variable 'ansible_search_path' from source: unknown 22286 1726882802.61130: calling self._execute() 22286 1726882802.61261: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.61265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.61269: variable 'omit' from source: magic vars 22286 1726882802.61942: variable 'ansible_distribution_major_version' from source: facts 22286 1726882802.61946: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882802.61949: variable 'omit' from source: magic vars 22286 1726882802.61952: variable 'omit' from source: magic vars 22286 1726882802.61954: variable 'profile' from source: include params 22286 1726882802.61957: variable 'interface' from source: play vars 22286 1726882802.62000: variable 'interface' from source: play vars 22286 1726882802.62035: variable 'omit' from source: magic vars 22286 1726882802.62073: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882802.62121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882802.62150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882802.62167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.62181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.62217: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882802.62227: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.62230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.62363: Set connection var ansible_shell_executable to /bin/sh 22286 1726882802.62373: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882802.62384: Set connection var ansible_connection to ssh 22286 1726882802.62432: Set connection var ansible_shell_type to sh 22286 1726882802.62447: Set connection var ansible_timeout to 10 22286 1726882802.62458: Set connection var ansible_pipelining to False 22286 1726882802.62491: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.62494: variable 'ansible_connection' from source: unknown 22286 1726882802.62497: variable 'ansible_module_compression' from source: unknown 22286 1726882802.62500: variable 'ansible_shell_type' from source: unknown 22286 1726882802.62503: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.62505: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.62510: variable 'ansible_pipelining' from source: unknown 22286 1726882802.62513: variable 'ansible_timeout' from 
source: unknown 22286 1726882802.62520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.62822: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882802.62826: variable 'omit' from source: magic vars 22286 1726882802.62828: starting attempt loop 22286 1726882802.62831: running the handler 22286 1726882802.62888: variable 'lsr_net_profile_fingerprint' from source: set_fact 22286 1726882802.62901: Evaluated conditional (lsr_net_profile_fingerprint): True 22286 1726882802.62913: handler run complete 22286 1726882802.62944: attempt loop complete, returning result 22286 1726882802.62952: _execute() done 22286 1726882802.62960: dumping result to json 22286 1726882802.62968: done dumping result, returning 22286 1726882802.62983: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [0affe814-3a2d-a75d-4836-0000000003bb] 22286 1726882802.62996: sending task result for task 0affe814-3a2d-a75d-4836-0000000003bb ok: [managed_node3] => { "changed": false } MSG: All assertions passed 22286 1726882802.63200: no more pending results, returning what we have 22286 1726882802.63205: results queue empty 22286 1726882802.63206: checking for any_errors_fatal 22286 1726882802.63212: done checking for any_errors_fatal 22286 1726882802.63213: checking for max_fail_percentage 22286 1726882802.63215: done checking for max_fail_percentage 22286 1726882802.63217: checking to see if all hosts have failed and the running result is not ok 22286 1726882802.63218: done checking to see if all hosts have failed 22286 1726882802.63219: getting the remaining hosts for this loop 22286 1726882802.63220: done getting the remaining hosts for this 
loop 22286 1726882802.63225: getting the next task for host managed_node3 22286 1726882802.63237: done getting next task for host managed_node3 22286 1726882802.63240: ^ task is: TASK: Get ip address information 22286 1726882802.63243: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882802.63247: getting variables 22286 1726882802.63249: in VariableManager get_vars() 22286 1726882802.63295: Calling all_inventory to load vars for managed_node3 22286 1726882802.63298: Calling groups_inventory to load vars for managed_node3 22286 1726882802.63301: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882802.63315: Calling all_plugins_play to load vars for managed_node3 22286 1726882802.63318: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882802.63322: Calling groups_plugins_play to load vars for managed_node3 22286 1726882802.63841: done sending task result for task 0affe814-3a2d-a75d-4836-0000000003bb 22286 1726882802.63845: WORKER PROCESS EXITING 22286 1726882802.67641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882802.71416: done with get_vars() 22286 1726882802.71491: done getting variables 22286 1726882802.71680: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Friday 20 September 2024 21:40:02 -0400 (0:00:00.115) 0:00:26.111 ****** 22286 1726882802.71765: entering _queue_task() for managed_node3/command 22286 1726882802.72091: worker is 1 (out of 1 available) 22286 1726882802.72105: exiting _queue_task() for managed_node3/command 22286 1726882802.72120: done queuing things up, now waiting for results queue to drain 22286 1726882802.72122: waiting for pending results... 22286 1726882802.72337: running TaskExecutor() for managed_node3/TASK: Get ip address information 22286 1726882802.72408: in run() - task 0affe814-3a2d-a75d-4836-00000000005e 22286 1726882802.72423: variable 'ansible_search_path' from source: unknown 22286 1726882802.72463: calling self._execute() 22286 1726882802.72569: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.72574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.72589: variable 'omit' from source: magic vars 22286 1726882802.72963: variable 'ansible_distribution_major_version' from source: facts 22286 1726882802.72983: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882802.72998: variable 'omit' from source: magic vars 22286 1726882802.73014: variable 'omit' from source: magic vars 22286 1726882802.73126: variable 'interface' from source: play vars 22286 1726882802.73148: variable 'omit' from source: magic vars 22286 1726882802.73191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882802.73221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882802.73242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882802.73278: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.73289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882802.73314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882802.73317: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.73322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.73415: Set connection var ansible_shell_executable to /bin/sh 22286 1726882802.73423: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882802.73426: Set connection var ansible_connection to ssh 22286 1726882802.73429: Set connection var ansible_shell_type to sh 22286 1726882802.73437: Set connection var ansible_timeout to 10 22286 1726882802.73445: Set connection var ansible_pipelining to False 22286 1726882802.73471: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.73475: variable 'ansible_connection' from source: unknown 22286 1726882802.73481: variable 'ansible_module_compression' from source: unknown 22286 1726882802.73483: variable 'ansible_shell_type' from source: unknown 22286 1726882802.73487: variable 'ansible_shell_executable' from source: unknown 22286 1726882802.73490: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882802.73492: variable 'ansible_pipelining' from source: unknown 22286 1726882802.73495: variable 'ansible_timeout' from source: unknown 22286 1726882802.73497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882802.73655: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882802.73685: variable 'omit' from source: magic vars 22286 1726882802.73689: starting attempt loop 22286 1726882802.73692: running the handler 22286 1726882802.73728: _low_level_execute_command(): starting 22286 1726882802.73732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882802.74343: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.74347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.74402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.74405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.74454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882802.74570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882802.76448: stdout chunk (state=3): >>>/root <<< 22286 
1726882802.76534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882802.76590: stderr chunk (state=3): >>><<< 22286 1726882802.76592: stdout chunk (state=3): >>><<< 22286 1726882802.76608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882802.76632: _low_level_execute_command(): starting 22286 1726882802.76637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608 `" && echo ansible-tmp-1726882802.76616-23206-9168473150608="` echo /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608 `" ) && sleep 0' 22286 1726882802.77277: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.77281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.77287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882802.77299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.77323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882802.77430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882802.79659: stdout chunk (state=3): >>>ansible-tmp-1726882802.76616-23206-9168473150608=/root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608 <<< 22286 1726882802.79867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882802.79870: stdout chunk (state=3): >>><<< 22286 1726882802.79873: stderr chunk (state=3): >>><<< 22286 1726882802.80056: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882802.76616-23206-9168473150608=/root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882802.80059: variable 'ansible_module_compression' from source: unknown 22286 1726882802.80062: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882802.80064: variable 'ansible_facts' from source: unknown 22286 1726882802.80194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py 22286 1726882802.80358: Sending initial data 22286 1726882802.80418: Sent initial data (152 bytes) 22286 1726882802.80996: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882802.81050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.81059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882802.81068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882802.81094: stderr chunk (state=3): >>>debug2: match found <<< 22286 1726882802.81215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882802.81221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882802.81352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882802.83065: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22286 1726882802.83125: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882802.83246: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882802.83340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpj62q7aww /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py <<< 22286 1726882802.83344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py" <<< 22286 1726882802.83463: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpj62q7aww" to remote "/root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py" <<< 22286 1726882802.85416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882802.85664: stderr chunk (state=3): >>><<< 22286 1726882802.85667: stdout chunk (state=3): >>><<< 22286 1726882802.85669: done transferring module to remote 22286 1726882802.85671: _low_level_execute_command(): starting 22286 1726882802.85674: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/ /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py && sleep 0' 22286 1726882802.86196: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882802.86205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882802.86254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.86257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882802.86260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 <<< 22286 1726882802.86263: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882802.86313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.86317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882802.86320: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882802.86322: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882802.86325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882802.86327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.86345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882802.86362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882802.86366: stderr chunk (state=3): >>>debug2: match found <<< 22286 1726882802.86368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.86471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882802.86474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882802.86477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882802.86623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882802.88767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882802.88791: stderr chunk (state=3): >>><<< 22286 1726882802.88794: stdout chunk (state=3): >>><<< 22286 1726882802.88916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882802.88920: _low_level_execute_command(): starting 22286 1726882802.88922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/AnsiballZ_command.py && sleep 0' 22286 1726882802.89660: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882802.89663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882802.89666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882802.89775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882802.89898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882802.89961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882802.90052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882802.90211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.08129: stdout chunk (state=3): >>> {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 12:47:77:58:de:1c brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::bb27:5193:b0bd:d95c/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:40:03.075092", "end": "2024-09-20 21:40:03.079143", "delta": "0:00:00.004051", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882803.09787: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882803.09833: stderr chunk (state=3): >>><<< 22286 1726882803.09840: stdout chunk (state=3): >>><<< 22286 1726882803.09859: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 12:47:77:58:de:1c brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::bb27:5193:b0bd:d95c/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:40:03.075092", "end": "2024-09-20 21:40:03.079143", "delta": "0:00:00.004051", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882803.09904: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882803.09914: _low_level_execute_command(): starting 22286 1726882803.09920: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882802.76616-23206-9168473150608/ > /dev/null 2>&1 && sleep 0' 22286 1726882803.10513: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882803.10674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.10796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.12898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.12925: stderr chunk (state=3): >>><<< 22286 1726882803.12928: stdout chunk (state=3): >>><<< 22286 1726882803.13140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.13144: handler run complete 22286 1726882803.13146: Evaluated conditional (False): False 22286 1726882803.13149: attempt loop complete, returning result 22286 1726882803.13151: _execute() done 22286 1726882803.13153: dumping result to json 22286 1726882803.13155: done dumping result, returning 22286 1726882803.13157: done running TaskExecutor() for managed_node3/TASK: Get ip address information [0affe814-3a2d-a75d-4836-00000000005e] 22286 1726882803.13159: sending task result for task 0affe814-3a2d-a75d-4836-00000000005e 22286 1726882803.13255: done sending task result for task 0affe814-3a2d-a75d-4836-00000000005e 22286 1726882803.13259: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.004051", "end": "2024-09-20 21:40:03.079143", "rc": 0, "start": "2024-09-20 21:40:03.075092" } STDOUT: 31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether 12:47:77:58:de:1c brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::bb27:5193:b0bd:d95c/64 scope link noprefixroute valid_lft forever preferred_lft forever 22286 1726882803.13389: no more pending results, returning what we have 22286 1726882803.13394: results queue empty 22286 1726882803.13395: checking for any_errors_fatal 22286 1726882803.13404: done checking for any_errors_fatal 22286 1726882803.13405: checking for max_fail_percentage 22286 1726882803.13408: done checking for max_fail_percentage 22286 1726882803.13409: checking 
to see if all hosts have failed and the running result is not ok 22286 1726882803.13410: done checking to see if all hosts have failed 22286 1726882803.13411: getting the remaining hosts for this loop 22286 1726882803.13413: done getting the remaining hosts for this loop 22286 1726882803.13419: getting the next task for host managed_node3 22286 1726882803.13426: done getting next task for host managed_node3 22286 1726882803.13430: ^ task is: TASK: Show ip_addr 22286 1726882803.13433: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882803.13586: getting variables 22286 1726882803.13588: in VariableManager get_vars() 22286 1726882803.13644: Calling all_inventory to load vars for managed_node3 22286 1726882803.13648: Calling groups_inventory to load vars for managed_node3 22286 1726882803.13652: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882803.13666: Calling all_plugins_play to load vars for managed_node3 22286 1726882803.13671: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882803.13675: Calling groups_plugins_play to load vars for managed_node3 22286 1726882803.15546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882803.17142: done with get_vars() 22286 1726882803.17164: done getting variables 22286 1726882803.17218: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ip_addr] 
************************************************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Friday 20 September 2024 21:40:03 -0400 (0:00:00.454) 0:00:26.565 ****** 22286 1726882803.17243: entering _queue_task() for managed_node3/debug 22286 1726882803.17490: worker is 1 (out of 1 available) 22286 1726882803.17503: exiting _queue_task() for managed_node3/debug 22286 1726882803.17514: done queuing things up, now waiting for results queue to drain 22286 1726882803.17516: waiting for pending results... 22286 1726882803.17708: running TaskExecutor() for managed_node3/TASK: Show ip_addr 22286 1726882803.17778: in run() - task 0affe814-3a2d-a75d-4836-00000000005f 22286 1726882803.17791: variable 'ansible_search_path' from source: unknown 22286 1726882803.17823: calling self._execute() 22286 1726882803.17911: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.17918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.17928: variable 'omit' from source: magic vars 22286 1726882803.18253: variable 'ansible_distribution_major_version' from source: facts 22286 1726882803.18263: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882803.18270: variable 'omit' from source: magic vars 22286 1726882803.18290: variable 'omit' from source: magic vars 22286 1726882803.18322: variable 'omit' from source: magic vars 22286 1726882803.18359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882803.18393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882803.18414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882803.18430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 22286 1726882803.18443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.18470: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882803.18473: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.18480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.18569: Set connection var ansible_shell_executable to /bin/sh 22286 1726882803.18581: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882803.18584: Set connection var ansible_connection to ssh 22286 1726882803.18586: Set connection var ansible_shell_type to sh 22286 1726882803.18591: Set connection var ansible_timeout to 10 22286 1726882803.18600: Set connection var ansible_pipelining to False 22286 1726882803.18621: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.18626: variable 'ansible_connection' from source: unknown 22286 1726882803.18629: variable 'ansible_module_compression' from source: unknown 22286 1726882803.18632: variable 'ansible_shell_type' from source: unknown 22286 1726882803.18636: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.18644: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.18647: variable 'ansible_pipelining' from source: unknown 22286 1726882803.18649: variable 'ansible_timeout' from source: unknown 22286 1726882803.18654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.18778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 
1726882803.18787: variable 'omit' from source: magic vars 22286 1726882803.18792: starting attempt loop 22286 1726882803.18796: running the handler 22286 1726882803.18903: variable 'ip_addr' from source: set_fact 22286 1726882803.18918: handler run complete 22286 1726882803.18933: attempt loop complete, returning result 22286 1726882803.18938: _execute() done 22286 1726882803.18941: dumping result to json 22286 1726882803.18951: done dumping result, returning 22286 1726882803.18954: done running TaskExecutor() for managed_node3/TASK: Show ip_addr [0affe814-3a2d-a75d-4836-00000000005f] 22286 1726882803.18961: sending task result for task 0affe814-3a2d-a75d-4836-00000000005f 22286 1726882803.19050: done sending task result for task 0affe814-3a2d-a75d-4836-00000000005f ok: [managed_node3] => { "ip_addr.stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 12:47:77:58:de:1c brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::bb27:5193:b0bd:d95c/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 22286 1726882803.19114: no more pending results, returning what we have 22286 1726882803.19118: results queue empty 22286 1726882803.19119: checking for any_errors_fatal 22286 1726882803.19128: done checking for any_errors_fatal 22286 1726882803.19129: checking for max_fail_percentage 22286 1726882803.19132: done checking for max_fail_percentage 22286 1726882803.19133: checking to see if all hosts have failed and the running result is not ok 22286 1726882803.19137: done checking to see if all hosts have failed 22286 1726882803.19137: getting the remaining hosts for this loop 22286 1726882803.19139: done getting the remaining hosts for this 
loop 22286 1726882803.19143: getting the next task for host managed_node3 22286 1726882803.19150: done getting next task for host managed_node3 22286 1726882803.19154: ^ task is: TASK: Assert ipv6 addresses are correctly set 22286 1726882803.19156: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882803.19159: getting variables 22286 1726882803.19171: in VariableManager get_vars() 22286 1726882803.19211: Calling all_inventory to load vars for managed_node3 22286 1726882803.19214: Calling groups_inventory to load vars for managed_node3 22286 1726882803.19217: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882803.19223: WORKER PROCESS EXITING 22286 1726882803.19232: Calling all_plugins_play to load vars for managed_node3 22286 1726882803.19237: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882803.19241: Calling groups_plugins_play to load vars for managed_node3 22286 1726882803.20444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882803.22114: done with get_vars() 22286 1726882803.22137: done getting variables 22286 1726882803.22184: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Friday 20 September 2024 21:40:03 -0400 (0:00:00.049) 0:00:26.615 
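The debug result above prints a single key, `ip_addr.stdout`, which suggests a task along these lines; this is a hypothetical reconstruction from the log, not the actual file contents (the real task is defined at `tests/network/playbooks/tests_ipv6.yml:57`), and it assumes `ip_addr` was registered by an earlier `command` task:

```yaml
# Hypothetical reconstruction of the "Show ip_addr" task traced above;
# see tests_ipv6.yml:57 for the authoritative definition.
- name: Show ip_addr
  debug:
    var: ip_addr.stdout
```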
****** 22286 1726882803.22204: entering _queue_task() for managed_node3/assert 22286 1726882803.22426: worker is 1 (out of 1 available) 22286 1726882803.22442: exiting _queue_task() for managed_node3/assert 22286 1726882803.22457: done queuing things up, now waiting for results queue to drain 22286 1726882803.22459: waiting for pending results... 22286 1726882803.22653: running TaskExecutor() for managed_node3/TASK: Assert ipv6 addresses are correctly set 22286 1726882803.22725: in run() - task 0affe814-3a2d-a75d-4836-000000000060 22286 1726882803.22741: variable 'ansible_search_path' from source: unknown 22286 1726882803.22774: calling self._execute() 22286 1726882803.22862: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.22868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.22882: variable 'omit' from source: magic vars 22286 1726882803.23202: variable 'ansible_distribution_major_version' from source: facts 22286 1726882803.23212: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882803.23220: variable 'omit' from source: magic vars 22286 1726882803.23243: variable 'omit' from source: magic vars 22286 1726882803.23274: variable 'omit' from source: magic vars 22286 1726882803.23312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882803.23348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882803.23366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882803.23385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.23396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 
1726882803.23422: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882803.23426: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.23430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.23521: Set connection var ansible_shell_executable to /bin/sh 22286 1726882803.23529: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882803.23532: Set connection var ansible_connection to ssh 22286 1726882803.23536: Set connection var ansible_shell_type to sh 22286 1726882803.23543: Set connection var ansible_timeout to 10 22286 1726882803.23552: Set connection var ansible_pipelining to False 22286 1726882803.23576: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.23582: variable 'ansible_connection' from source: unknown 22286 1726882803.23584: variable 'ansible_module_compression' from source: unknown 22286 1726882803.23589: variable 'ansible_shell_type' from source: unknown 22286 1726882803.23592: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.23597: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.23602: variable 'ansible_pipelining' from source: unknown 22286 1726882803.23605: variable 'ansible_timeout' from source: unknown 22286 1726882803.23610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.23736: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882803.23746: variable 'omit' from source: magic vars 22286 1726882803.23752: starting attempt loop 22286 1726882803.23756: running the handler 22286 1726882803.23876: variable 'ip_addr' from source: 
set_fact 22286 1726882803.23893: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 22286 1726882803.23999: variable 'ip_addr' from source: set_fact 22286 1726882803.24005: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 22286 1726882803.24109: variable 'ip_addr' from source: set_fact 22286 1726882803.24122: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 22286 1726882803.24125: handler run complete 22286 1726882803.24140: attempt loop complete, returning result 22286 1726882803.24143: _execute() done 22286 1726882803.24146: dumping result to json 22286 1726882803.24151: done dumping result, returning 22286 1726882803.24159: done running TaskExecutor() for managed_node3/TASK: Assert ipv6 addresses are correctly set [0affe814-3a2d-a75d-4836-000000000060] 22286 1726882803.24165: sending task result for task 0affe814-3a2d-a75d-4836-000000000060 22286 1726882803.24255: done sending task result for task 0affe814-3a2d-a75d-4836-000000000060 22286 1726882803.24258: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 22286 1726882803.24312: no more pending results, returning what we have 22286 1726882803.24316: results queue empty 22286 1726882803.24317: checking for any_errors_fatal 22286 1726882803.24322: done checking for any_errors_fatal 22286 1726882803.24323: checking for max_fail_percentage 22286 1726882803.24325: done checking for max_fail_percentage 22286 1726882803.24326: checking to see if all hosts have failed and the running result is not ok 22286 1726882803.24327: done checking to see if all hosts have failed 22286 1726882803.24328: getting the remaining hosts for this loop 22286 1726882803.24330: done getting the remaining hosts for this loop 22286 1726882803.24336: getting the next task for host managed_node3 22286 1726882803.24342: done getting next task for host managed_node3 22286 1726882803.24345: ^ task is: TASK: Get ipv6 routes 22286 
1726882803.24347: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882803.24350: getting variables 22286 1726882803.24351: in VariableManager get_vars() 22286 1726882803.24391: Calling all_inventory to load vars for managed_node3 22286 1726882803.24395: Calling groups_inventory to load vars for managed_node3 22286 1726882803.24397: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882803.24407: Calling all_plugins_play to load vars for managed_node3 22286 1726882803.24410: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882803.24413: Calling groups_plugins_play to load vars for managed_node3 22286 1726882803.25621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882803.27197: done with get_vars() 22286 1726882803.27219: done getting variables 22286 1726882803.27267: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Friday 20 September 2024 21:40:03 -0400 (0:00:00.050) 0:00:26.666 ****** 22286 1726882803.27294: entering _queue_task() for managed_node3/command 22286 1726882803.27516: worker is 1 (out of 1 available) 22286 1726882803.27528: exiting _queue_task() for managed_node3/command 22286 1726882803.27542: done queuing things up, now waiting for 
results queue to drain 22286 1726882803.27544: waiting for pending results... 22286 1726882803.27743: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 22286 1726882803.27811: in run() - task 0affe814-3a2d-a75d-4836-000000000061 22286 1726882803.27823: variable 'ansible_search_path' from source: unknown 22286 1726882803.27857: calling self._execute() 22286 1726882803.27946: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.27954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.27964: variable 'omit' from source: magic vars 22286 1726882803.28284: variable 'ansible_distribution_major_version' from source: facts 22286 1726882803.28294: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882803.28300: variable 'omit' from source: magic vars 22286 1726882803.28321: variable 'omit' from source: magic vars 22286 1726882803.28354: variable 'omit' from source: magic vars 22286 1726882803.28391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882803.28425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882803.28444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882803.28460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.28471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.28501: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882803.28505: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.28508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.28599: Set 
connection var ansible_shell_executable to /bin/sh 22286 1726882803.28608: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882803.28611: Set connection var ansible_connection to ssh 22286 1726882803.28614: Set connection var ansible_shell_type to sh 22286 1726882803.28621: Set connection var ansible_timeout to 10 22286 1726882803.28629: Set connection var ansible_pipelining to False 22286 1726882803.28654: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.28658: variable 'ansible_connection' from source: unknown 22286 1726882803.28661: variable 'ansible_module_compression' from source: unknown 22286 1726882803.28664: variable 'ansible_shell_type' from source: unknown 22286 1726882803.28668: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.28672: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.28679: variable 'ansible_pipelining' from source: unknown 22286 1726882803.28684: variable 'ansible_timeout' from source: unknown 22286 1726882803.28689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.28811: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882803.28820: variable 'omit' from source: magic vars 22286 1726882803.28826: starting attempt loop 22286 1726882803.28829: running the handler 22286 1726882803.28847: _low_level_execute_command(): starting 22286 1726882803.28857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882803.29418: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 
1726882803.29423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882803.29427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.29484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882803.29487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882803.29489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.29608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.31486: stdout chunk (state=3): >>>/root <<< 22286 1726882803.31599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.31650: stderr chunk (state=3): >>><<< 22286 1726882803.31653: stdout chunk (state=3): >>><<< 22286 1726882803.31683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.31693: _low_level_execute_command(): starting 22286 1726882803.31701: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543 `" && echo ansible-tmp-1726882803.3167946-23230-186160960912543="` echo /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543 `" ) && sleep 0' 22286 1726882803.32190: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882803.32193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882803.32197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882803.32206: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.32259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882803.32266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.32380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.34520: stdout chunk (state=3): >>>ansible-tmp-1726882803.3167946-23230-186160960912543=/root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543 <<< 22286 1726882803.34749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.34753: stdout chunk (state=3): >>><<< 22286 1726882803.34756: stderr chunk (state=3): >>><<< 22286 1726882803.34759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882803.3167946-23230-186160960912543=/root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.34762: variable 'ansible_module_compression' from source: unknown 22286 1726882803.34813: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882803.34858: variable 'ansible_facts' from source: unknown 22286 1726882803.34947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py 22286 1726882803.35189: Sending initial data 22286 1726882803.35193: Sent initial data (156 bytes) 22286 1726882803.35615: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882803.35645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.35690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882803.35707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.35820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.37562: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882803.37574: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882803.37673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882803.37791: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpzep58bhq /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py <<< 22286 1726882803.37794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py" <<< 22286 1726882803.37898: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 22286 1726882803.37905: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpzep58bhq" to remote "/root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py" <<< 22286 1726882803.39643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.39707: stderr chunk (state=3): >>><<< 22286 1726882803.39711: stdout chunk (state=3): >>><<< 22286 1726882803.39730: done transferring module to remote 22286 1726882803.39743: _low_level_execute_command(): starting 22286 1726882803.39748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/ /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py && sleep 0' 22286 1726882803.40398: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882803.40501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.40565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882803.40584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882803.40643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.40779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.42921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.42933: stderr chunk (state=3): >>><<< 22286 1726882803.42947: stdout chunk (state=3): >>><<< 22286 1726882803.43067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.43071: _low_level_execute_command(): starting 22286 1726882803.43073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/AnsiballZ_command.py && sleep 0' 22286 1726882803.44155: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882803.44175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882803.44294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.44444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.62480: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:40:03.618941", "end": "2024-09-20 21:40:03.622773", "delta": "0:00:00.003832", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882803.64308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882803.64357: stderr chunk (state=3): >>><<< 22286 1726882803.64359: stdout chunk (state=3): >>><<< 22286 1726882803.64374: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:40:03.618941", "end": "2024-09-20 21:40:03.622773", "delta": "0:00:00.003832", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882803.64462: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882803.64467: _low_level_execute_command(): starting 22286 1726882803.64469: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882803.3167946-23230-186160960912543/ > /dev/null 2>&1 && sleep 0' 22286 1726882803.65057: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.65126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882803.65145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882803.65177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.65395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.67385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.67431: stderr chunk (state=3): >>><<< 22286 1726882803.67436: stdout chunk (state=3): >>><<< 22286 1726882803.67454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.67461: handler run complete 22286 1726882803.67486: Evaluated conditional (False): False 22286 1726882803.67496: attempt loop complete, returning result 22286 1726882803.67499: _execute() done 22286 1726882803.67504: dumping result to json 22286 1726882803.67510: done dumping result, returning 22286 1726882803.67518: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [0affe814-3a2d-a75d-4836-000000000061] 22286 1726882803.67524: sending task result for task 0affe814-3a2d-a75d-4836-000000000061 22286 1726882803.67637: done sending task result for task 0affe814-3a2d-a75d-4836-000000000061 22286 1726882803.67640: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003832", "end": "2024-09-20 21:40:03.622773", "rc": 0, "start": "2024-09-20 21:40:03.618941" } STDOUT: 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 22286 1726882803.67742: no more pending results, returning what we have 22286 1726882803.67746: results 
queue empty 22286 1726882803.67747: checking for any_errors_fatal 22286 1726882803.67756: done checking for any_errors_fatal 22286 1726882803.67757: checking for max_fail_percentage 22286 1726882803.67759: done checking for max_fail_percentage 22286 1726882803.67761: checking to see if all hosts have failed and the running result is not ok 22286 1726882803.67762: done checking to see if all hosts have failed 22286 1726882803.67762: getting the remaining hosts for this loop 22286 1726882803.67764: done getting the remaining hosts for this loop 22286 1726882803.67769: getting the next task for host managed_node3 22286 1726882803.67778: done getting next task for host managed_node3 22286 1726882803.67781: ^ task is: TASK: Show ipv6_route 22286 1726882803.67783: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882803.67787: getting variables 22286 1726882803.67788: in VariableManager get_vars() 22286 1726882803.67829: Calling all_inventory to load vars for managed_node3 22286 1726882803.67832: Calling groups_inventory to load vars for managed_node3 22286 1726882803.67845: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882803.67856: Calling all_plugins_play to load vars for managed_node3 22286 1726882803.67860: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882803.67863: Calling groups_plugins_play to load vars for managed_node3 22286 1726882803.70441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882803.72052: done with get_vars() 22286 1726882803.72074: done getting variables 22286 1726882803.72124: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Friday 20 September 2024 21:40:03 -0400 (0:00:00.448) 0:00:27.114 ****** 22286 1726882803.72152: entering _queue_task() for managed_node3/debug 22286 1726882803.72390: worker is 1 (out of 1 available) 22286 1726882803.72403: exiting _queue_task() for managed_node3/debug 22286 1726882803.72416: done queuing things up, now waiting for results queue to drain 22286 1726882803.72418: waiting for pending results... 
22286 1726882803.72601: running TaskExecutor() for managed_node3/TASK: Show ipv6_route 22286 1726882803.72666: in run() - task 0affe814-3a2d-a75d-4836-000000000062 22286 1726882803.72683: variable 'ansible_search_path' from source: unknown 22286 1726882803.72714: calling self._execute() 22286 1726882803.72807: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.72812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.72824: variable 'omit' from source: magic vars 22286 1726882803.73340: variable 'ansible_distribution_major_version' from source: facts 22286 1726882803.73345: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882803.73348: variable 'omit' from source: magic vars 22286 1726882803.73350: variable 'omit' from source: magic vars 22286 1726882803.73352: variable 'omit' from source: magic vars 22286 1726882803.73365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882803.73405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882803.73427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882803.73450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.73465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.73585: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882803.73590: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.73593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.73644: Set connection var ansible_shell_executable to /bin/sh 22286 1726882803.73656: Set 
connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882803.73659: Set connection var ansible_connection to ssh 22286 1726882803.73661: Set connection var ansible_shell_type to sh 22286 1726882803.73693: Set connection var ansible_timeout to 10 22286 1726882803.73697: Set connection var ansible_pipelining to False 22286 1726882803.73711: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.73714: variable 'ansible_connection' from source: unknown 22286 1726882803.73716: variable 'ansible_module_compression' from source: unknown 22286 1726882803.73722: variable 'ansible_shell_type' from source: unknown 22286 1726882803.73724: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.73729: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.73737: variable 'ansible_pipelining' from source: unknown 22286 1726882803.73739: variable 'ansible_timeout' from source: unknown 22286 1726882803.73745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.73916: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882803.74038: variable 'omit' from source: magic vars 22286 1726882803.74043: starting attempt loop 22286 1726882803.74046: running the handler 22286 1726882803.74086: variable 'ipv6_route' from source: set_fact 22286 1726882803.74104: handler run complete 22286 1726882803.74125: attempt loop complete, returning result 22286 1726882803.74129: _execute() done 22286 1726882803.74132: dumping result to json 22286 1726882803.74143: done dumping result, returning 22286 1726882803.74147: done running TaskExecutor() for managed_node3/TASK: Show ipv6_route [0affe814-3a2d-a75d-4836-000000000062] 
22286 1726882803.74159: sending task result for task 0affe814-3a2d-a75d-4836-000000000062 22286 1726882803.74448: done sending task result for task 0affe814-3a2d-a75d-4836-000000000062 22286 1726882803.74451: WORKER PROCESS EXITING ok: [managed_node3] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 22286 1726882803.74492: no more pending results, returning what we have 22286 1726882803.74495: results queue empty 22286 1726882803.74496: checking for any_errors_fatal 22286 1726882803.74503: done checking for any_errors_fatal 22286 1726882803.74504: checking for max_fail_percentage 22286 1726882803.74506: done checking for max_fail_percentage 22286 1726882803.74508: checking to see if all hosts have failed and the running result is not ok 22286 1726882803.74510: done checking to see if all hosts have failed 22286 1726882803.74511: getting the remaining hosts for this loop 22286 1726882803.74512: done getting the remaining hosts for this loop 22286 1726882803.74516: getting the next task for host managed_node3 22286 1726882803.74522: done getting next task for host managed_node3 22286 1726882803.74525: ^ task is: TASK: Assert default ipv6 route is set 22286 1726882803.74528: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882803.74531: getting variables 22286 1726882803.74532: in VariableManager get_vars() 22286 1726882803.74582: Calling all_inventory to load vars for managed_node3 22286 1726882803.74586: Calling groups_inventory to load vars for managed_node3 22286 1726882803.74589: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882803.74600: Calling all_plugins_play to load vars for managed_node3 22286 1726882803.74603: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882803.74607: Calling groups_plugins_play to load vars for managed_node3 22286 1726882803.76570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882803.78504: done with get_vars() 22286 1726882803.78526: done getting variables 22286 1726882803.78573: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Friday 20 September 2024 21:40:03 -0400 (0:00:00.064) 0:00:27.179 ****** 22286 1726882803.78598: entering _queue_task() for managed_node3/assert 22286 1726882803.78827: worker is 1 (out of 1 available) 22286 1726882803.78840: exiting _queue_task() for managed_node3/assert 22286 1726882803.78853: done queuing things up, now waiting for results queue to drain 22286 1726882803.78855: waiting for pending results... 
22286 1726882803.79032: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is set 22286 1726882803.79100: in run() - task 0affe814-3a2d-a75d-4836-000000000063 22286 1726882803.79113: variable 'ansible_search_path' from source: unknown 22286 1726882803.79147: calling self._execute() 22286 1726882803.79232: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.79238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.79249: variable 'omit' from source: magic vars 22286 1726882803.79566: variable 'ansible_distribution_major_version' from source: facts 22286 1726882803.79578: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882803.79582: variable 'omit' from source: magic vars 22286 1726882803.79600: variable 'omit' from source: magic vars 22286 1726882803.79631: variable 'omit' from source: magic vars 22286 1726882803.79671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882803.79703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882803.79733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882803.79754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.79780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.79840: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882803.79844: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.79848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.79946: Set connection var ansible_shell_executable to /bin/sh 22286 
1726882803.79959: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882803.79962: Set connection var ansible_connection to ssh 22286 1726882803.79965: Set connection var ansible_shell_type to sh 22286 1726882803.79981: Set connection var ansible_timeout to 10 22286 1726882803.79984: Set connection var ansible_pipelining to False 22286 1726882803.80052: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.80056: variable 'ansible_connection' from source: unknown 22286 1726882803.80059: variable 'ansible_module_compression' from source: unknown 22286 1726882803.80062: variable 'ansible_shell_type' from source: unknown 22286 1726882803.80065: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.80067: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.80070: variable 'ansible_pipelining' from source: unknown 22286 1726882803.80072: variable 'ansible_timeout' from source: unknown 22286 1726882803.80089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.80301: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882803.80304: variable 'omit' from source: magic vars 22286 1726882803.80309: starting attempt loop 22286 1726882803.80311: running the handler 22286 1726882803.80499: variable '__test_str' from source: task vars 22286 1726882803.80605: variable 'interface' from source: play vars 22286 1726882803.80621: variable 'ipv6_route' from source: set_fact 22286 1726882803.80681: Evaluated conditional (__test_str in ipv6_route.stdout): True 22286 1726882803.80759: handler run complete 22286 1726882803.80762: attempt loop complete, returning result 22286 1726882803.80765: 
_execute() done 22286 1726882803.80769: dumping result to json 22286 1726882803.80772: done dumping result, returning 22286 1726882803.80904: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is set [0affe814-3a2d-a75d-4836-000000000063] 22286 1726882803.80908: sending task result for task 0affe814-3a2d-a75d-4836-000000000063 22286 1726882803.80979: done sending task result for task 0affe814-3a2d-a75d-4836-000000000063 22286 1726882803.80982: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 22286 1726882803.81071: no more pending results, returning what we have 22286 1726882803.81074: results queue empty 22286 1726882803.81077: checking for any_errors_fatal 22286 1726882803.81082: done checking for any_errors_fatal 22286 1726882803.81083: checking for max_fail_percentage 22286 1726882803.81085: done checking for max_fail_percentage 22286 1726882803.81086: checking to see if all hosts have failed and the running result is not ok 22286 1726882803.81087: done checking to see if all hosts have failed 22286 1726882803.81088: getting the remaining hosts for this loop 22286 1726882803.81090: done getting the remaining hosts for this loop 22286 1726882803.81093: getting the next task for host managed_node3 22286 1726882803.81098: done getting next task for host managed_node3 22286 1726882803.81101: ^ task is: TASK: Ensure ping6 command is present 22286 1726882803.81103: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882803.81106: getting variables 22286 1726882803.81107: in VariableManager get_vars() 22286 1726882803.81152: Calling all_inventory to load vars for managed_node3 22286 1726882803.81154: Calling groups_inventory to load vars for managed_node3 22286 1726882803.81156: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882803.81164: Calling all_plugins_play to load vars for managed_node3 22286 1726882803.81166: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882803.81168: Calling groups_plugins_play to load vars for managed_node3 22286 1726882803.82392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882803.83978: done with get_vars() 22286 1726882803.83999: done getting variables 22286 1726882803.84047: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Friday 20 September 2024 21:40:03 -0400 (0:00:00.054) 0:00:27.234 ****** 22286 1726882803.84071: entering _queue_task() for managed_node3/package 22286 1726882803.84362: worker is 1 (out of 1 available) 22286 1726882803.84375: exiting _queue_task() for managed_node3/package 22286 1726882803.84389: done queuing things up, now waiting for results queue to drain 22286 1726882803.84391: waiting for pending results... 
22286 1726882803.84689: running TaskExecutor() for managed_node3/TASK: Ensure ping6 command is present 22286 1726882803.84757: in run() - task 0affe814-3a2d-a75d-4836-000000000064 22286 1726882803.84762: variable 'ansible_search_path' from source: unknown 22286 1726882803.84816: calling self._execute() 22286 1726882803.84924: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.84937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.84953: variable 'omit' from source: magic vars 22286 1726882803.85461: variable 'ansible_distribution_major_version' from source: facts 22286 1726882803.85471: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882803.85478: variable 'omit' from source: magic vars 22286 1726882803.85516: variable 'omit' from source: magic vars 22286 1726882803.85768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882803.88204: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882803.88305: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882803.88331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882803.88371: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882803.88415: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882803.88545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882803.88585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882803.88614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882803.88682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882803.88689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882803.88828: variable '__network_is_ostree' from source: set_fact 22286 1726882803.88832: variable 'omit' from source: magic vars 22286 1726882803.88847: variable 'omit' from source: magic vars 22286 1726882803.88874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882803.88898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882803.88914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882803.88934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.88970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882803.89014: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882803.89018: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.89021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.89120: 
Set connection var ansible_shell_executable to /bin/sh 22286 1726882803.89124: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882803.89127: Set connection var ansible_connection to ssh 22286 1726882803.89129: Set connection var ansible_shell_type to sh 22286 1726882803.89137: Set connection var ansible_timeout to 10 22286 1726882803.89164: Set connection var ansible_pipelining to False 22286 1726882803.89198: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.89202: variable 'ansible_connection' from source: unknown 22286 1726882803.89209: variable 'ansible_module_compression' from source: unknown 22286 1726882803.89212: variable 'ansible_shell_type' from source: unknown 22286 1726882803.89215: variable 'ansible_shell_executable' from source: unknown 22286 1726882803.89219: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882803.89223: variable 'ansible_pipelining' from source: unknown 22286 1726882803.89225: variable 'ansible_timeout' from source: unknown 22286 1726882803.89226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882803.89365: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882803.89375: variable 'omit' from source: magic vars 22286 1726882803.89380: starting attempt loop 22286 1726882803.89382: running the handler 22286 1726882803.89386: variable 'ansible_facts' from source: unknown 22286 1726882803.89389: variable 'ansible_facts' from source: unknown 22286 1726882803.89428: _low_level_execute_command(): starting 22286 1726882803.89442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882803.90135: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882803.90147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.90220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.90325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.92181: stdout chunk (state=3): >>>/root <<< 22286 1726882803.92300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.92408: stderr chunk (state=3): >>><<< 22286 1726882803.92412: stdout chunk (state=3): >>><<< 22286 1726882803.92433: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.92450: _low_level_execute_command(): starting 22286 1726882803.92455: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008 `" && echo ansible-tmp-1726882803.9243429-23256-95552716793008="` echo /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008 `" ) && sleep 0' 22286 1726882803.92956: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882803.92959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.92962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 
1726882803.92964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.93033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.93148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.95293: stdout chunk (state=3): >>>ansible-tmp-1726882803.9243429-23256-95552716793008=/root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008 <<< 22286 1726882803.95432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882803.95445: stderr chunk (state=3): >>><<< 22286 1726882803.95448: stdout chunk (state=3): >>><<< 22286 1726882803.95464: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882803.9243429-23256-95552716793008=/root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882803.95490: variable 'ansible_module_compression' from source: unknown 22286 1726882803.95541: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 22286 1726882803.95578: variable 'ansible_facts' from source: unknown 22286 1726882803.95654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py 22286 1726882803.95770: Sending initial data 22286 1726882803.95773: Sent initial data (151 bytes) 22286 1726882803.96222: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882803.96225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.96229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882803.96231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882803.96279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882803.96284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882803.96400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882803.98127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882803.98323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882803.98456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmph7fydout /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py <<< 22286 1726882803.98460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py" <<< 22286 1726882803.98567: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmph7fydout" to remote "/root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py" <<< 22286 1726882804.00816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882804.01020: stderr chunk (state=3): >>><<< 22286 1726882804.01025: stdout chunk (state=3): >>><<< 22286 1726882804.01032: done transferring module to remote 22286 1726882804.01037: _low_level_execute_command(): starting 22286 1726882804.01039: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/ /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py && sleep 0' 22286 1726882804.01657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882804.01704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882804.01874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882804.04440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882804.04443: stdout chunk (state=3): >>><<< 22286 1726882804.04445: stderr chunk (state=3): >>><<< 22286 1726882804.04448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882804.04451: _low_level_execute_command(): starting 22286 1726882804.04454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/AnsiballZ_dnf.py && sleep 0' 22286 1726882804.05316: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882804.05329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882804.05344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882804.05530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882804.05549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882804.05863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882805.57138: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, 
"allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22286 1726882805.62250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882805.62255: stdout chunk (state=3): >>><<< 22286 1726882805.62265: stderr chunk (state=3): >>><<< 22286 1726882805.62294: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882805.62377: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882805.62398: _low_level_execute_command(): starting 22286 1726882805.62409: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882803.9243429-23256-95552716793008/ > /dev/null 2>&1 && sleep 0' 22286 1726882805.63153: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882805.63188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882805.63201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882805.63257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882805.63419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882805.63535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882805.63650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882805.65733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882805.65764: stdout chunk (state=3): >>><<< 22286 1726882805.65767: stderr chunk (state=3): >>><<< 22286 1726882805.65785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 
originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882805.65939: handler run complete 22286 1726882805.65942: attempt loop complete, returning result 22286 1726882805.65945: _execute() done 22286 1726882805.65947: dumping result to json 22286 1726882805.65949: done dumping result, returning 22286 1726882805.65951: done running TaskExecutor() for managed_node3/TASK: Ensure ping6 command is present [0affe814-3a2d-a75d-4836-000000000064] 22286 1726882805.65953: sending task result for task 0affe814-3a2d-a75d-4836-000000000064 22286 1726882805.66030: done sending task result for task 0affe814-3a2d-a75d-4836-000000000064 22286 1726882805.66036: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22286 1726882805.66208: no more pending results, returning what we have 22286 1726882805.66212: results queue empty 22286 1726882805.66213: checking for any_errors_fatal 22286 1726882805.66221: done checking for any_errors_fatal 22286 1726882805.66222: checking for max_fail_percentage 22286 1726882805.66224: done checking for max_fail_percentage 22286 1726882805.66225: checking to see if all hosts have failed and the 
running result is not ok 22286 1726882805.66226: done checking to see if all hosts have failed 22286 1726882805.66227: getting the remaining hosts for this loop 22286 1726882805.66229: done getting the remaining hosts for this loop 22286 1726882805.66233: getting the next task for host managed_node3 22286 1726882805.66241: done getting next task for host managed_node3 22286 1726882805.66244: ^ task is: TASK: Test gateway can be pinged 22286 1726882805.66246: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882805.66250: getting variables 22286 1726882805.66251: in VariableManager get_vars() 22286 1726882805.66298: Calling all_inventory to load vars for managed_node3 22286 1726882805.66301: Calling groups_inventory to load vars for managed_node3 22286 1726882805.66304: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882805.66316: Calling all_plugins_play to load vars for managed_node3 22286 1726882805.66319: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882805.66322: Calling groups_plugins_play to load vars for managed_node3 22286 1726882805.70116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882805.73911: done with get_vars() 22286 1726882805.73959: done getting variables 22286 1726882805.74032: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Friday 20 September 2024 21:40:05 -0400 (0:00:01.901) 0:00:29.135 ****** 22286 1726882805.74193: entering _queue_task() for managed_node3/command 22286 1726882805.74932: worker is 1 (out of 1 available) 22286 1726882805.75096: exiting _queue_task() for managed_node3/command 22286 1726882805.75111: done queuing things up, now waiting for results queue to drain 22286 1726882805.75112: waiting for pending results... 22286 1726882805.75553: running TaskExecutor() for managed_node3/TASK: Test gateway can be pinged 22286 1726882805.75895: in run() - task 0affe814-3a2d-a75d-4836-000000000065 22286 1726882805.75898: variable 'ansible_search_path' from source: unknown 22286 1726882805.75902: calling self._execute() 22286 1726882805.76130: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882805.76170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882805.76195: variable 'omit' from source: magic vars 22286 1726882805.76869: variable 'ansible_distribution_major_version' from source: facts 22286 1726882805.76892: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882805.76904: variable 'omit' from source: magic vars 22286 1726882805.76942: variable 'omit' from source: magic vars 22286 1726882805.77052: variable 'omit' from source: magic vars 22286 1726882805.77058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882805.77110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882805.77143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882805.77182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 
1726882805.77202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882805.77244: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882805.77254: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882805.77268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882805.77610: Set connection var ansible_shell_executable to /bin/sh 22286 1726882805.77613: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882805.77616: Set connection var ansible_connection to ssh 22286 1726882805.77619: Set connection var ansible_shell_type to sh 22286 1726882805.77621: Set connection var ansible_timeout to 10 22286 1726882805.77623: Set connection var ansible_pipelining to False 22286 1726882805.77816: variable 'ansible_shell_executable' from source: unknown 22286 1726882805.77822: variable 'ansible_connection' from source: unknown 22286 1726882805.77825: variable 'ansible_module_compression' from source: unknown 22286 1726882805.77827: variable 'ansible_shell_type' from source: unknown 22286 1726882805.77829: variable 'ansible_shell_executable' from source: unknown 22286 1726882805.77832: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882805.77841: variable 'ansible_pipelining' from source: unknown 22286 1726882805.77847: variable 'ansible_timeout' from source: unknown 22286 1726882805.77850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882805.78347: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882805.78354: variable 'omit' from source: 
magic vars 22286 1726882805.78357: starting attempt loop 22286 1726882805.78362: running the handler 22286 1726882805.78383: _low_level_execute_command(): starting 22286 1726882805.78408: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882805.79287: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882805.79355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882805.79439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882805.79459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882805.79490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882805.79644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882805.81551: stdout chunk (state=3): >>>/root <<< 22286 1726882805.81991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882805.81995: stdout chunk (state=3): >>><<< 22286 1726882805.81997: stderr chunk (state=3): >>><<< 22286 1726882805.82000: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882805.82002: _low_level_execute_command(): starting 22286 1726882805.82005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441 `" && echo ansible-tmp-1726882805.8192909-23324-215841039631441="` echo /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441 `" ) && sleep 0' 22286 1726882805.83185: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882805.83188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882805.83196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882805.83205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882805.83263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882805.83275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882805.83296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882805.83454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882805.85587: stdout chunk (state=3): >>>ansible-tmp-1726882805.8192909-23324-215841039631441=/root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441 <<< 22286 1726882805.85845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882805.85848: stdout chunk (state=3): >>><<< 22286 1726882805.85851: stderr chunk (state=3): >>><<< 22286 1726882805.85853: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882805.8192909-23324-215841039631441=/root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882805.85879: variable 'ansible_module_compression' from source: unknown 22286 1726882805.85941: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882805.85995: variable 'ansible_facts' from source: unknown 22286 1726882805.86140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py 22286 1726882805.86324: Sending initial data 22286 1726882805.86327: Sent initial data (156 bytes) 22286 1726882805.86972: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882805.87051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882805.87112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882805.87236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882805.88985: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882805.89107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882805.89223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpzbhadap4 /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py <<< 22286 1726882805.89239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py" <<< 22286 1726882805.89347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpzbhadap4" to remote "/root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py" <<< 22286 1726882805.90943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882805.90947: stdout chunk (state=3): >>><<< 22286 1726882805.90951: stderr chunk (state=3): >>><<< 22286 1726882805.90954: done transferring module to remote 22286 1726882805.90956: _low_level_execute_command(): starting 22286 1726882805.90958: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/ /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py && sleep 0' 22286 1726882805.91644: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882805.91663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 
1726882805.91765: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882805.91798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882805.91943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882805.94040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882805.94049: stdout chunk (state=3): >>><<< 22286 1726882805.94058: stderr chunk (state=3): >>><<< 22286 1726882805.94162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882805.94165: _low_level_execute_command(): starting 22286 1726882805.94169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/AnsiballZ_command.py && sleep 0' 22286 1726882805.94740: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882805.94757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882805.94886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882805.94909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882805.94928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882805.95084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882806.13237: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.063 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.063/0.063/0.063/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:40:06.123831", "end": "2024-09-20 21:40:06.130159", "delta": "0:00:00.006328", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882806.15041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882806.15045: stderr chunk (state=3): >>><<< 22286 1726882806.15047: stdout chunk (state=3): >>><<< 22286 1726882806.15055: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.063 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.063/0.063/0.063/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:40:06.123831", "end": "2024-09-20 21:40:06.130159", "delta": "0:00:00.006328", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882806.15109: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882806.15125: _low_level_execute_command(): starting 22286 1726882806.15132: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882805.8192909-23324-215841039631441/ > /dev/null 2>&1 && sleep 0' 22286 1726882806.15772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882806.15781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882806.15855: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882806.15886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882806.15893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882806.15922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882806.16047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882806.18238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882806.18242: stdout chunk (state=3): >>><<< 22286 1726882806.18244: stderr chunk (state=3): >>><<< 22286 1726882806.18247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882806.18250: 
handler run complete 22286 1726882806.18252: Evaluated conditional (False): False 22286 1726882806.18262: attempt loop complete, returning result 22286 1726882806.18265: _execute() done 22286 1726882806.18274: dumping result to json 22286 1726882806.18283: done dumping result, returning 22286 1726882806.18294: done running TaskExecutor() for managed_node3/TASK: Test gateway can be pinged [0affe814-3a2d-a75d-4836-000000000065] 22286 1726882806.18301: sending task result for task 0affe814-3a2d-a75d-4836-000000000065 22286 1726882806.18436: done sending task result for task 0affe814-3a2d-a75d-4836-000000000065 22286 1726882806.18440: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.006328", "end": "2024-09-20 21:40:06.130159", "rc": 0, "start": "2024-09-20 21:40:06.123831" } STDOUT: PING 2001:db8::1(2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.063 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.063/0.063/0.063/0.000 ms 22286 1726882806.18551: no more pending results, returning what we have 22286 1726882806.18647: results queue empty 22286 1726882806.18649: checking for any_errors_fatal 22286 1726882806.18669: done checking for any_errors_fatal 22286 1726882806.18670: checking for max_fail_percentage 22286 1726882806.18673: done checking for max_fail_percentage 22286 1726882806.18674: checking to see if all hosts have failed and the running result is not ok 22286 1726882806.18675: done checking to see if all hosts have failed 22286 1726882806.18676: getting the remaining hosts for this loop 22286 1726882806.18678: done getting the remaining hosts for this loop 22286 1726882806.18684: getting the next task for host managed_node3 22286 1726882806.18692: done getting next task for host managed_node3 22286 1726882806.18696: ^ task is: TASK: TEARDOWN: remove profiles. 
22286 1726882806.18699: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882806.18703: getting variables 22286 1726882806.18705: in VariableManager get_vars() 22286 1726882806.18889: Calling all_inventory to load vars for managed_node3 22286 1726882806.18894: Calling groups_inventory to load vars for managed_node3 22286 1726882806.18897: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.18910: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.18914: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.18919: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.21475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.24829: done with get_vars() 22286 1726882806.24866: done getting variables 22286 1726882806.24939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Friday 20 September 2024 21:40:06 -0400 (0:00:00.507) 0:00:29.643 ****** 22286 1726882806.24971: entering _queue_task() for managed_node3/debug 22286 1726882806.25296: worker is 1 (out of 1 available) 22286 1726882806.25307: exiting _queue_task() for managed_node3/debug 22286 1726882806.25320: done queuing things up, now waiting for results queue to drain 22286 1726882806.25321: waiting for pending results... 22286 1726882806.25756: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 22286 1726882806.25762: in run() - task 0affe814-3a2d-a75d-4836-000000000066 22286 1726882806.25765: variable 'ansible_search_path' from source: unknown 22286 1726882806.25798: calling self._execute() 22286 1726882806.25918: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.25932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.25951: variable 'omit' from source: magic vars 22286 1726882806.26403: variable 'ansible_distribution_major_version' from source: facts 22286 1726882806.26422: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882806.26506: variable 'omit' from source: magic vars 22286 1726882806.26510: variable 'omit' from source: magic vars 22286 1726882806.26512: variable 'omit' from source: magic vars 22286 1726882806.26560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882806.26608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882806.26639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882806.26667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 22286 1726882806.26689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882806.26731: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882806.26833: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.26836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.26894: Set connection var ansible_shell_executable to /bin/sh 22286 1726882806.26910: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882806.26918: Set connection var ansible_connection to ssh 22286 1726882806.26925: Set connection var ansible_shell_type to sh 22286 1726882806.26940: Set connection var ansible_timeout to 10 22286 1726882806.26956: Set connection var ansible_pipelining to False 22286 1726882806.26987: variable 'ansible_shell_executable' from source: unknown 22286 1726882806.26997: variable 'ansible_connection' from source: unknown 22286 1726882806.27005: variable 'ansible_module_compression' from source: unknown 22286 1726882806.27013: variable 'ansible_shell_type' from source: unknown 22286 1726882806.27021: variable 'ansible_shell_executable' from source: unknown 22286 1726882806.27029: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.27041: variable 'ansible_pipelining' from source: unknown 22286 1726882806.27052: variable 'ansible_timeout' from source: unknown 22286 1726882806.27062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.27232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 
1726882806.27253: variable 'omit' from source: magic vars 22286 1726882806.27268: starting attempt loop 22286 1726882806.27278: running the handler 22286 1726882806.27377: handler run complete 22286 1726882806.27381: attempt loop complete, returning result 22286 1726882806.27383: _execute() done 22286 1726882806.27385: dumping result to json 22286 1726882806.27388: done dumping result, returning 22286 1726882806.27397: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [0affe814-3a2d-a75d-4836-000000000066] 22286 1726882806.27408: sending task result for task 0affe814-3a2d-a75d-4836-000000000066 ok: [managed_node3] => {} MSG: ################################################## 22286 1726882806.27560: no more pending results, returning what we have 22286 1726882806.27564: results queue empty 22286 1726882806.27565: checking for any_errors_fatal 22286 1726882806.27579: done checking for any_errors_fatal 22286 1726882806.27580: checking for max_fail_percentage 22286 1726882806.27583: done checking for max_fail_percentage 22286 1726882806.27584: checking to see if all hosts have failed and the running result is not ok 22286 1726882806.27585: done checking to see if all hosts have failed 22286 1726882806.27586: getting the remaining hosts for this loop 22286 1726882806.27588: done getting the remaining hosts for this loop 22286 1726882806.27593: getting the next task for host managed_node3 22286 1726882806.27603: done getting next task for host managed_node3 22286 1726882806.27610: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22286 1726882806.27615: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882806.27641: getting variables 22286 1726882806.27643: in VariableManager get_vars() 22286 1726882806.27692: Calling all_inventory to load vars for managed_node3 22286 1726882806.27695: Calling groups_inventory to load vars for managed_node3 22286 1726882806.27699: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.27711: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.27714: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.27718: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.28521: done sending task result for task 0affe814-3a2d-a75d-4836-000000000066 22286 1726882806.28525: WORKER PROCESS EXITING 22286 1726882806.30314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.34848: done with get_vars() 22286 1726882806.34893: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:40:06 -0400 (0:00:00.100) 0:00:29.743 ****** 22286 1726882806.35016: entering _queue_task() for managed_node3/include_tasks 22286 1726882806.35570: worker is 1 (out of 1 available) 22286 1726882806.35584: exiting _queue_task() for managed_node3/include_tasks 22286 1726882806.35597: done queuing things up, now waiting for results queue to drain 22286 1726882806.35599: waiting for pending results... 
22286 1726882806.36355: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22286 1726882806.36566: in run() - task 0affe814-3a2d-a75d-4836-00000000006e 22286 1726882806.36592: variable 'ansible_search_path' from source: unknown 22286 1726882806.36647: variable 'ansible_search_path' from source: unknown 22286 1726882806.36699: calling self._execute() 22286 1726882806.36958: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.36970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.37009: variable 'omit' from source: magic vars 22286 1726882806.38046: variable 'ansible_distribution_major_version' from source: facts 22286 1726882806.38103: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882806.38106: _execute() done 22286 1726882806.38109: dumping result to json 22286 1726882806.38111: done dumping result, returning 22286 1726882806.38114: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-a75d-4836-00000000006e] 22286 1726882806.38116: sending task result for task 0affe814-3a2d-a75d-4836-00000000006e 22286 1726882806.38453: no more pending results, returning what we have 22286 1726882806.38458: in VariableManager get_vars() 22286 1726882806.38521: Calling all_inventory to load vars for managed_node3 22286 1726882806.38525: Calling groups_inventory to load vars for managed_node3 22286 1726882806.38528: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.38546: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.38550: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.38555: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.39153: done sending task result for task 0affe814-3a2d-a75d-4836-00000000006e 22286 
1726882806.39157: WORKER PROCESS EXITING 22286 1726882806.41430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.44433: done with get_vars() 22286 1726882806.44467: variable 'ansible_search_path' from source: unknown 22286 1726882806.44468: variable 'ansible_search_path' from source: unknown 22286 1726882806.44520: we have included files to process 22286 1726882806.44521: generating all_blocks data 22286 1726882806.44524: done generating all_blocks data 22286 1726882806.44532: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22286 1726882806.44533: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22286 1726882806.44538: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22286 1726882806.45467: done processing included file 22286 1726882806.45469: iterating over new_blocks loaded from include file 22286 1726882806.45471: in VariableManager get_vars() 22286 1726882806.45507: done with get_vars() 22286 1726882806.45509: filtering new block on tags 22286 1726882806.45531: done filtering new block on tags 22286 1726882806.45537: in VariableManager get_vars() 22286 1726882806.45580: done with get_vars() 22286 1726882806.45582: filtering new block on tags 22286 1726882806.45626: done filtering new block on tags 22286 1726882806.45630: in VariableManager get_vars() 22286 1726882806.45665: done with get_vars() 22286 1726882806.45667: filtering new block on tags 22286 1726882806.45695: done filtering new block on tags 22286 1726882806.45698: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 22286 1726882806.45704: extending task lists for all hosts 
with included blocks 22286 1726882806.46844: done extending task lists 22286 1726882806.46846: done processing included files 22286 1726882806.46847: results queue empty 22286 1726882806.46847: checking for any_errors_fatal 22286 1726882806.46851: done checking for any_errors_fatal 22286 1726882806.46852: checking for max_fail_percentage 22286 1726882806.46854: done checking for max_fail_percentage 22286 1726882806.46855: checking to see if all hosts have failed and the running result is not ok 22286 1726882806.46856: done checking to see if all hosts have failed 22286 1726882806.46857: getting the remaining hosts for this loop 22286 1726882806.46858: done getting the remaining hosts for this loop 22286 1726882806.46862: getting the next task for host managed_node3 22286 1726882806.46867: done getting next task for host managed_node3 22286 1726882806.46870: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22286 1726882806.46874: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882806.46889: getting variables 22286 1726882806.46890: in VariableManager get_vars() 22286 1726882806.46909: Calling all_inventory to load vars for managed_node3 22286 1726882806.46912: Calling groups_inventory to load vars for managed_node3 22286 1726882806.46915: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.46921: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.46925: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.46929: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.48582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.54305: done with get_vars() 22286 1726882806.54472: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:40:06 -0400 (0:00:00.195) 0:00:29.939 ****** 22286 1726882806.54604: entering _queue_task() for managed_node3/setup 22286 1726882806.55353: worker is 1 (out of 1 available) 22286 1726882806.55366: exiting _queue_task() for managed_node3/setup 22286 1726882806.55379: done queuing things up, now waiting for results queue to drain 22286 1726882806.55380: waiting for pending results... 
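The fact-gathering task queued above is guarded by the Jinja2 conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, which the log evaluates to False shortly after this point. A minimal Python sketch of that check, using hypothetical fact names rather than the role's real `__network_required_facts` defaults:

```python
# Sketch of the Jinja2 guard on the fact-gathering task:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Ansible's `difference` filter keeps items of the left list that are absent
# from the right one. The fact names below are illustrative assumptions.

required_facts = ["distribution", "distribution_major_version", "os_family"]
gathered_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# `difference` behaves like set subtraction on the list of fact names.
missing = set(required_facts) - set(gathered_facts)

# A non-empty difference would force a setup (fact-gathering) run; an empty
# one means every required fact is already cached, so the task is skipped.
print(len(missing) > 0)  # -> False
```

With all required facts already present, the guard is False and the task is skipped, which matches the `when evaluation is False, skipping this task` entries in the log.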
22286 1726882806.55621: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22286 1726882806.55758: in run() - task 0affe814-3a2d-a75d-4836-000000000513 22286 1726882806.55769: variable 'ansible_search_path' from source: unknown 22286 1726882806.55773: variable 'ansible_search_path' from source: unknown 22286 1726882806.55809: calling self._execute() 22286 1726882806.55901: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.55908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.55918: variable 'omit' from source: magic vars 22286 1726882806.56246: variable 'ansible_distribution_major_version' from source: facts 22286 1726882806.56257: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882806.56453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882806.59580: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882806.59797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882806.59801: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882806.59806: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882806.59809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882806.59924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882806.59929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882806.59932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882806.60015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882806.60019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882806.60035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882806.60163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882806.60197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882806.60332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882806.60377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882806.60675: variable '__network_required_facts' from source: role 
'' defaults 22286 1726882806.60708: variable 'ansible_facts' from source: unknown 22286 1726882806.62055: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22286 1726882806.62066: when evaluation is False, skipping this task 22286 1726882806.62074: _execute() done 22286 1726882806.62089: dumping result to json 22286 1726882806.62125: done dumping result, returning 22286 1726882806.62155: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-a75d-4836-000000000513] 22286 1726882806.62196: sending task result for task 0affe814-3a2d-a75d-4836-000000000513 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882806.62376: no more pending results, returning what we have 22286 1726882806.62380: results queue empty 22286 1726882806.62381: checking for any_errors_fatal 22286 1726882806.62383: done checking for any_errors_fatal 22286 1726882806.62384: checking for max_fail_percentage 22286 1726882806.62386: done checking for max_fail_percentage 22286 1726882806.62388: checking to see if all hosts have failed and the running result is not ok 22286 1726882806.62389: done checking to see if all hosts have failed 22286 1726882806.62390: getting the remaining hosts for this loop 22286 1726882806.62392: done getting the remaining hosts for this loop 22286 1726882806.62396: getting the next task for host managed_node3 22286 1726882806.62409: done getting next task for host managed_node3 22286 1726882806.62535: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22286 1726882806.62541: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882806.62565: getting variables 22286 1726882806.62567: in VariableManager get_vars() 22286 1726882806.62619: Calling all_inventory to load vars for managed_node3 22286 1726882806.62623: Calling groups_inventory to load vars for managed_node3 22286 1726882806.62626: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.62782: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.62787: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.62792: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.63358: done sending task result for task 0affe814-3a2d-a75d-4836-000000000513 22286 1726882806.63362: WORKER PROCESS EXITING 22286 1726882806.64469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.67057: done with get_vars() 22286 1726882806.67081: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:06 -0400 (0:00:00.125) 0:00:30.065 ****** 22286 1726882806.67172: entering _queue_task() for managed_node3/stat 22286 1726882806.67417: worker is 
1 (out of 1 available) 22286 1726882806.67431: exiting _queue_task() for managed_node3/stat 22286 1726882806.67446: done queuing things up, now waiting for results queue to drain 22286 1726882806.67448: waiting for pending results... 22286 1726882806.67641: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 22286 1726882806.67783: in run() - task 0affe814-3a2d-a75d-4836-000000000515 22286 1726882806.67799: variable 'ansible_search_path' from source: unknown 22286 1726882806.67803: variable 'ansible_search_path' from source: unknown 22286 1726882806.67838: calling self._execute() 22286 1726882806.67925: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.67932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.67944: variable 'omit' from source: magic vars 22286 1726882806.68265: variable 'ansible_distribution_major_version' from source: facts 22286 1726882806.68277: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882806.68416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882806.68647: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882806.68690: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882806.68746: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882806.68797: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882806.68898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882806.68990: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882806.68994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882806.68997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882806.69147: variable '__network_is_ostree' from source: set_fact 22286 1726882806.69151: Evaluated conditional (not __network_is_ostree is defined): False 22286 1726882806.69155: when evaluation is False, skipping this task 22286 1726882806.69161: _execute() done 22286 1726882806.69164: dumping result to json 22286 1726882806.69166: done dumping result, returning 22286 1726882806.69169: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-a75d-4836-000000000515] 22286 1726882806.69171: sending task result for task 0affe814-3a2d-a75d-4836-000000000515 22286 1726882806.69237: done sending task result for task 0affe814-3a2d-a75d-4836-000000000515 22286 1726882806.69241: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22286 1726882806.69324: no more pending results, returning what we have 22286 1726882806.69329: results queue empty 22286 1726882806.69331: checking for any_errors_fatal 22286 1726882806.69339: done checking for any_errors_fatal 22286 1726882806.69340: checking for max_fail_percentage 22286 1726882806.69343: done checking for max_fail_percentage 22286 1726882806.69344: checking to see if all hosts have failed and the running result is not ok 22286 
1726882806.69345: done checking to see if all hosts have failed 22286 1726882806.69346: getting the remaining hosts for this loop 22286 1726882806.69348: done getting the remaining hosts for this loop 22286 1726882806.69353: getting the next task for host managed_node3 22286 1726882806.69362: done getting next task for host managed_node3 22286 1726882806.69367: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22286 1726882806.69372: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882806.69395: getting variables 22286 1726882806.69397: in VariableManager get_vars() 22286 1726882806.69452: Calling all_inventory to load vars for managed_node3 22286 1726882806.69456: Calling groups_inventory to load vars for managed_node3 22286 1726882806.69459: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.69481: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.69485: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.69490: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.71282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.73356: done with get_vars() 22286 1726882806.73383: done getting variables 22286 1726882806.73446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:40:06 -0400 (0:00:00.063) 0:00:30.128 ****** 22286 1726882806.73476: entering _queue_task() for managed_node3/set_fact 22286 1726882806.73777: worker is 1 (out of 1 available) 22286 1726882806.73790: exiting _queue_task() for managed_node3/set_fact 22286 1726882806.73804: done queuing things up, now waiting for results queue to drain 22286 1726882806.73806: waiting for pending results... 
22286 1726882806.74116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22286 1726882806.74310: in run() - task 0affe814-3a2d-a75d-4836-000000000516 22286 1726882806.74343: variable 'ansible_search_path' from source: unknown 22286 1726882806.74348: variable 'ansible_search_path' from source: unknown 22286 1726882806.74411: calling self._execute() 22286 1726882806.74489: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.74493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.74523: variable 'omit' from source: magic vars 22286 1726882806.74860: variable 'ansible_distribution_major_version' from source: facts 22286 1726882806.74870: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882806.75011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882806.75235: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882806.75273: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882806.75326: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882806.75360: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882806.75433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882806.75457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882806.75482: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882806.75504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882806.75577: variable '__network_is_ostree' from source: set_fact 22286 1726882806.75581: Evaluated conditional (not __network_is_ostree is defined): False 22286 1726882806.75585: when evaluation is False, skipping this task 22286 1726882806.75588: _execute() done 22286 1726882806.75593: dumping result to json 22286 1726882806.75598: done dumping result, returning 22286 1726882806.75609: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-a75d-4836-000000000516] 22286 1726882806.75612: sending task result for task 0affe814-3a2d-a75d-4836-000000000516 22286 1726882806.75710: done sending task result for task 0affe814-3a2d-a75d-4836-000000000516 22286 1726882806.75714: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22286 1726882806.75767: no more pending results, returning what we have 22286 1726882806.75770: results queue empty 22286 1726882806.75771: checking for any_errors_fatal 22286 1726882806.75779: done checking for any_errors_fatal 22286 1726882806.75780: checking for max_fail_percentage 22286 1726882806.75782: done checking for max_fail_percentage 22286 1726882806.75783: checking to see if all hosts have failed and the running result is not ok 22286 1726882806.75784: done checking to see if all hosts have failed 22286 1726882806.75785: getting the remaining hosts for this loop 22286 1726882806.75787: done getting the remaining hosts for this loop 
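Both ostree-related tasks above are skipped by the same guard, `when: not __network_is_ostree is defined`. A rough Python stand-in for how that `is defined` test behaves once an earlier play has already stored the fact (the stored value here is illustrative):

```python
# Sketch of the `when: not __network_is_ostree is defined` guard seen twice
# above. Once any earlier task has set the fact via set_fact, Jinja2's
# `is defined` test is true, so the negated guard is false and both the
# stat check and the follow-up set_fact are skipped.

host_facts = {"__network_is_ostree": False}  # value is an illustrative assumption

def jinja_is_defined(name, scope):
    # Rough stand-in for Jinja2's `is defined` test against host variables.
    return name in scope

should_run = not jinja_is_defined("__network_is_ostree", host_facts)
print(should_run)  # -> False, matching "Conditional result was False"
```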
22286 1726882806.75791: getting the next task for host managed_node3 22286 1726882806.75800: done getting next task for host managed_node3 22286 1726882806.75804: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22286 1726882806.75808: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882806.75825: getting variables 22286 1726882806.75826: in VariableManager get_vars() 22286 1726882806.75868: Calling all_inventory to load vars for managed_node3 22286 1726882806.75871: Calling groups_inventory to load vars for managed_node3 22286 1726882806.75873: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882806.75885: Calling all_plugins_play to load vars for managed_node3 22286 1726882806.75888: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882806.75892: Calling groups_plugins_play to load vars for managed_node3 22286 1726882806.77520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882806.79326: done with get_vars() 22286 1726882806.79349: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:40:06 -0400 (0:00:00.059) 0:00:30.187 ****** 22286 1726882806.79426: entering _queue_task() for managed_node3/service_facts 22286 1726882806.79648: worker is 1 (out of 1 available) 22286 1726882806.79661: exiting _queue_task() for managed_node3/service_facts 22286 1726882806.79674: done queuing things up, now waiting for results queue to drain 22286 1726882806.79679: waiting for pending results... 
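For the `service_facts` task queued above, the log next resolves connection variables ("Set connection var ansible_connection to ssh", "Set connection var ansible_timeout to 10", and so on). A simplified sketch of that default-with-override lookup; the values mirror the log, but the single-dict precedence model here is an assumption, far flatter than Ansible's real variable precedence:

```python
# Simplified sketch of connection-variable resolution for one task.
# Values mirror the "Set connection var ..." lines in the log; the
# precedence model (host vars over defaults) is deliberately reduced.

defaults = {
    "ansible_connection": "ssh",
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_timeout": 10,
    "ansible_pipelining": False,
}

host_vars = {}  # per-host overrides (inventory, play vars, ...) would win here

def connection_var(name):
    # Fall back to the built-in default when the host defines no override.
    return host_vars.get(name, defaults[name])

print(connection_var("ansible_timeout"))  # -> 10
```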
22286 1726882806.79860: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 22286 1726882806.79982: in run() - task 0affe814-3a2d-a75d-4836-000000000518 22286 1726882806.79994: variable 'ansible_search_path' from source: unknown 22286 1726882806.79998: variable 'ansible_search_path' from source: unknown 22286 1726882806.80032: calling self._execute() 22286 1726882806.80111: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.80118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.80131: variable 'omit' from source: magic vars 22286 1726882806.80439: variable 'ansible_distribution_major_version' from source: facts 22286 1726882806.80452: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882806.80464: variable 'omit' from source: magic vars 22286 1726882806.80527: variable 'omit' from source: magic vars 22286 1726882806.80560: variable 'omit' from source: magic vars 22286 1726882806.80597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882806.80627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882806.80646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882806.80664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882806.80683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882806.80706: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882806.80710: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.80713: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 22286 1726882806.80800: Set connection var ansible_shell_executable to /bin/sh 22286 1726882806.80808: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882806.80812: Set connection var ansible_connection to ssh 22286 1726882806.80815: Set connection var ansible_shell_type to sh 22286 1726882806.80822: Set connection var ansible_timeout to 10 22286 1726882806.80832: Set connection var ansible_pipelining to False 22286 1726882806.80852: variable 'ansible_shell_executable' from source: unknown 22286 1726882806.80855: variable 'ansible_connection' from source: unknown 22286 1726882806.80858: variable 'ansible_module_compression' from source: unknown 22286 1726882806.80862: variable 'ansible_shell_type' from source: unknown 22286 1726882806.80866: variable 'ansible_shell_executable' from source: unknown 22286 1726882806.80870: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882806.80877: variable 'ansible_pipelining' from source: unknown 22286 1726882806.80880: variable 'ansible_timeout' from source: unknown 22286 1726882806.80883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882806.81048: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882806.81058: variable 'omit' from source: magic vars 22286 1726882806.81063: starting attempt loop 22286 1726882806.81066: running the handler 22286 1726882806.81080: _low_level_execute_command(): starting 22286 1726882806.81088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882806.81615: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 22286 1726882806.81619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882806.81623: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882806.81671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882806.81675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882806.81806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882806.83695: stdout chunk (state=3): >>>/root <<< 22286 1726882806.83801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882806.83851: stderr chunk (state=3): >>><<< 22286 1726882806.83855: stdout chunk (state=3): >>><<< 22286 1726882806.83875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882806.83887: _low_level_execute_command(): starting 22286 1726882806.83895: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617 `" && echo ansible-tmp-1726882806.8387465-23367-129919547587617="` echo /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617 `" ) && sleep 0' 22286 1726882806.84304: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882806.84339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882806.84343: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882806.84345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 
1726882806.84355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882806.84358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882806.84406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882806.84409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882806.84530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882806.87022: stdout chunk (state=3): >>>ansible-tmp-1726882806.8387465-23367-129919547587617=/root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617 <<< 22286 1726882806.87117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882806.87120: stdout chunk (state=3): >>><<< 22286 1726882806.87123: stderr chunk (state=3): >>><<< 22286 1726882806.87139: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882806.8387465-23367-129919547587617=/root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882806.87188: variable 'ansible_module_compression' from source: unknown 22286 1726882806.87333: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 22286 1726882806.87336: variable 'ansible_facts' from source: unknown 22286 1726882806.87371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py 22286 1726882806.87599: Sending initial data 22286 1726882806.87613: Sent initial data (162 bytes) 22286 1726882806.88214: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882806.88243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882806.88259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882806.88406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882806.90169: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882806.90302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882806.90448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp7uk3_t9e /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py <<< 22286 1726882806.90470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py" <<< 22286 1726882806.90560: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp7uk3_t9e" to remote "/root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py" <<< 22286 1726882806.92113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882806.92153: stderr chunk (state=3): >>><<< 22286 1726882806.92166: stdout chunk (state=3): >>><<< 22286 1726882806.92192: done transferring module to remote 22286 1726882806.92281: _low_level_execute_command(): starting 22286 1726882806.92285: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/ /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py && sleep 0' 22286 1726882806.92853: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882806.92908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882806.92951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882806.93038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882806.93061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882806.93205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882806.95297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882806.95300: stdout chunk (state=3): >>><<< 22286 1726882806.95303: stderr chunk (state=3): >>><<< 22286 1726882806.95321: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882806.95340: _low_level_execute_command(): starting 22286 1726882806.95421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/AnsiballZ_service_facts.py && sleep 0' 22286 1726882806.96026: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882806.96044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882806.96058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882806.96212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882808.89442: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": 
{"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 22286 1726882808.89496: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": 
"systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": 
"mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22286 1726882808.91142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882808.91248: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. <<< 22286 1726882808.91266: stderr chunk (state=3): >>><<< 22286 1726882808.91283: stdout chunk (state=3): >>><<< 22286 1726882808.91441: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882808.92573: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882808.92593: _low_level_execute_command(): starting 22286 1726882808.92605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882806.8387465-23367-129919547587617/ > /dev/null 2>&1 && sleep 0' 22286 1726882808.93895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882808.93912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882808.94013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882808.94169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882808.94280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882808.96349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882808.96740: stderr chunk (state=3): >>><<< 22286 1726882808.96743: stdout chunk (state=3): >>><<< 22286 1726882808.96746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882808.96748: handler run complete 22286 
1726882808.97054: variable 'ansible_facts' from source: unknown 22286 1726882808.97498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882808.98707: variable 'ansible_facts' from source: unknown 22286 1726882808.98817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882808.99196: attempt loop complete, returning result 22286 1726882808.99204: _execute() done 22286 1726882808.99207: dumping result to json 22286 1726882808.99319: done dumping result, returning 22286 1726882808.99331: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-a75d-4836-000000000518] 22286 1726882808.99339: sending task result for task 0affe814-3a2d-a75d-4836-000000000518 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882809.00615: no more pending results, returning what we have 22286 1726882809.00618: results queue empty 22286 1726882809.00619: checking for any_errors_fatal 22286 1726882809.00624: done checking for any_errors_fatal 22286 1726882809.00625: checking for max_fail_percentage 22286 1726882809.00627: done checking for max_fail_percentage 22286 1726882809.00628: checking to see if all hosts have failed and the running result is not ok 22286 1726882809.00630: done checking to see if all hosts have failed 22286 1726882809.00631: getting the remaining hosts for this loop 22286 1726882809.00632: done getting the remaining hosts for this loop 22286 1726882809.00638: getting the next task for host managed_node3 22286 1726882809.00645: done getting next task for host managed_node3 22286 1726882809.00649: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22286 1726882809.00654: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882809.00666: getting variables 22286 1726882809.00667: in VariableManager get_vars() 22286 1726882809.00706: Calling all_inventory to load vars for managed_node3 22286 1726882809.00710: Calling groups_inventory to load vars for managed_node3 22286 1726882809.00713: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882809.00724: Calling all_plugins_play to load vars for managed_node3 22286 1726882809.00727: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882809.00731: Calling groups_plugins_play to load vars for managed_node3 22286 1726882809.00748: done sending task result for task 0affe814-3a2d-a75d-4836-000000000518 22286 1726882809.00751: WORKER PROCESS EXITING 22286 1726882809.04897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882809.08152: done with get_vars() 22286 1726882809.08208: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 
21:40:09 -0400 (0:00:02.289) 0:00:32.477 ****** 22286 1726882809.08399: entering _queue_task() for managed_node3/package_facts 22286 1726882809.08979: worker is 1 (out of 1 available) 22286 1726882809.08991: exiting _queue_task() for managed_node3/package_facts 22286 1726882809.09004: done queuing things up, now waiting for results queue to drain 22286 1726882809.09006: waiting for pending results... 22286 1726882809.09301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 22286 1726882809.09541: in run() - task 0affe814-3a2d-a75d-4836-000000000519 22286 1726882809.09545: variable 'ansible_search_path' from source: unknown 22286 1726882809.09549: variable 'ansible_search_path' from source: unknown 22286 1726882809.09597: calling self._execute() 22286 1726882809.09753: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882809.09769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882809.09789: variable 'omit' from source: magic vars 22286 1726882809.10466: variable 'ansible_distribution_major_version' from source: facts 22286 1726882809.10494: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882809.10508: variable 'omit' from source: magic vars 22286 1726882809.10699: variable 'omit' from source: magic vars 22286 1726882809.10703: variable 'omit' from source: magic vars 22286 1726882809.10714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882809.10761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882809.10788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882809.10821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 
1726882809.10841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882809.10882: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882809.10892: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882809.10902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882809.11049: Set connection var ansible_shell_executable to /bin/sh 22286 1726882809.11067: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882809.11075: Set connection var ansible_connection to ssh 22286 1726882809.11083: Set connection var ansible_shell_type to sh 22286 1726882809.11094: Set connection var ansible_timeout to 10 22286 1726882809.11109: Set connection var ansible_pipelining to False 22286 1726882809.11147: variable 'ansible_shell_executable' from source: unknown 22286 1726882809.11157: variable 'ansible_connection' from source: unknown 22286 1726882809.11245: variable 'ansible_module_compression' from source: unknown 22286 1726882809.11248: variable 'ansible_shell_type' from source: unknown 22286 1726882809.11251: variable 'ansible_shell_executable' from source: unknown 22286 1726882809.11253: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882809.11255: variable 'ansible_pipelining' from source: unknown 22286 1726882809.11258: variable 'ansible_timeout' from source: unknown 22286 1726882809.11260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882809.11444: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882809.11469: variable 'omit' from source: magic vars 22286 1726882809.11480: 
starting attempt loop 22286 1726882809.11488: running the handler 22286 1726882809.11507: _low_level_execute_command(): starting 22286 1726882809.11519: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882809.12259: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882809.12276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882809.12336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882809.12412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882809.12439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882809.12485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882809.12610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882809.14475: stdout chunk (state=3): >>>/root <<< 22286 1726882809.14658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882809.14662: stdout chunk (state=3): >>><<< 22286 1726882809.14664: stderr chunk (state=3): >>><<< 22286 1726882809.14781: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882809.14786: _low_level_execute_command(): starting 22286 1726882809.14789: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954 `" && echo ansible-tmp-1726882809.1468723-23467-92936169576954="` echo /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954 `" ) && sleep 0' 22286 1726882809.15319: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882809.15337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882809.15353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882809.15382: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882809.15401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882809.15415: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882809.15439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882809.15459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882809.15492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882809.15578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882809.15606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882809.15624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882809.15774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882809.17966: stdout chunk (state=3): >>>ansible-tmp-1726882809.1468723-23467-92936169576954=/root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954 <<< 22286 1726882809.18152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882809.18239: stdout chunk (state=3): >>><<< 22286 1726882809.18243: stderr chunk (state=3): >>><<< 22286 1726882809.18246: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882809.1468723-23467-92936169576954=/root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882809.18248: variable 'ansible_module_compression' from source: unknown 22286 1726882809.18299: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 22286 1726882809.18362: variable 'ansible_facts' from source: unknown 22286 1726882809.18592: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py 22286 1726882809.18781: Sending initial data 22286 1726882809.18791: Sent initial data (161 bytes) 22286 1726882809.19400: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882809.19424: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 22286 1726882809.19539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882809.19567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882809.19714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882809.21552: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882809.21666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882809.21812: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpzgbsqznx /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py <<< 22286 1726882809.21816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py" <<< 22286 1726882809.21951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpzgbsqznx" to remote "/root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py" <<< 22286 1726882809.24164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882809.24168: stderr chunk (state=3): >>><<< 22286 1726882809.24179: stdout chunk (state=3): >>><<< 22286 1726882809.24440: done transferring module to remote 22286 1726882809.24443: _low_level_execute_command(): starting 22286 1726882809.24446: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/ /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py && sleep 0' 22286 1726882809.24798: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882809.24818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882809.24854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882809.24878: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882809.24895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882809.24899: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882809.24907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882809.24931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882809.24941: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882809.24949: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882809.24972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882809.25052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882809.25083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882809.25219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882809.27249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882809.27283: stderr chunk (state=3): >>><<< 22286 1726882809.27287: stdout chunk (state=3): >>><<< 22286 1726882809.27308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882809.27339: _low_level_execute_command(): starting 22286 1726882809.27343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/AnsiballZ_package_facts.py && sleep 0' 22286 1726882809.27900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882809.27917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882809.27936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882809.27956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882809.28056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882809.28072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882809.28092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882809.28213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882809.92344: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", 
"version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 22286 1726882809.92387: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": 
[{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": 
"grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 22286 1726882809.92398: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": 
"util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", 
"release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 22286 1726882809.92440: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": 
"nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 22286 1726882809.92463: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": 
"2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "<<< 22286 1726882809.92515: stdout chunk (state=3): 
>>>version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": 
"python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 22286 1726882809.92527: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", 
"version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "sou<<< 22286 1726882809.92533: stdout chunk (state=3): >>>rce": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": 
"500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": 
"1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "a<<< 22286 1726882809.92561: stdout chunk (state=3): >>>spell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": 
"18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 22286 1726882809.92584: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 22286 1726882809.92593: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": 
"2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", 
"version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": 
"checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", 
"version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22286 1726882809.94637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882809.94640: stdout chunk (state=3): >>><<< 22286 1726882809.94643: stderr chunk (state=3): >>><<< 22286 1726882809.94848: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", 
"release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": 
"amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": 
"3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", 
"version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": 
"libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": 
"0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": 
"1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": 
"initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": 
[{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", 
"release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", 
"version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": 
"2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": 
"0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882809.97656: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882809.97676: _low_level_execute_command(): starting 22286 1726882809.97686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882809.1468723-23467-92936169576954/ > /dev/null 2>&1 && sleep 0' 22286 1726882809.98171: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882809.98174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882809.98180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882809.98183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882809.98185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882809.98241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882809.98247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882809.98250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882809.98363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882810.00430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882810.00486: stderr chunk (state=3): >>><<< 22286 1726882810.00490: stdout chunk (state=3): >>><<< 22286 1726882810.00506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 22286 1726882810.00514: handler run complete 22286 1726882810.01325: variable 'ansible_facts' from source: unknown 22286 1726882810.01788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.03800: variable 'ansible_facts' from source: unknown 22286 1726882810.04233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.04994: attempt loop complete, returning result 22286 1726882810.05009: _execute() done 22286 1726882810.05012: dumping result to json 22286 1726882810.05194: done dumping result, returning 22286 1726882810.05203: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-a75d-4836-000000000519] 22286 1726882810.05209: sending task result for task 0affe814-3a2d-a75d-4836-000000000519 22286 1726882810.07250: done sending task result for task 0affe814-3a2d-a75d-4836-000000000519 22286 1726882810.07253: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882810.07359: no more pending results, returning what we have 22286 1726882810.07362: results queue empty 22286 1726882810.07362: checking for any_errors_fatal 22286 1726882810.07366: done checking for any_errors_fatal 22286 1726882810.07367: checking for max_fail_percentage 22286 1726882810.07368: done checking for max_fail_percentage 22286 1726882810.07369: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.07370: done checking to see if all hosts have failed 22286 1726882810.07370: getting the remaining hosts for this loop 22286 1726882810.07371: done getting the remaining hosts for this loop 22286 1726882810.07375: getting the next task for host managed_node3 22286 1726882810.07382: done 
getting next task for host managed_node3 22286 1726882810.07386: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22286 1726882810.07388: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882810.07396: getting variables 22286 1726882810.07397: in VariableManager get_vars() 22286 1726882810.07427: Calling all_inventory to load vars for managed_node3 22286 1726882810.07429: Calling groups_inventory to load vars for managed_node3 22286 1726882810.07431: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.07440: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.07443: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.07445: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.08614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.10177: done with get_vars() 22286 1726882810.10201: done getting variables 22286 1726882810.10257: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:10 -0400 (0:00:01.018) 0:00:33.496 ****** 22286 1726882810.10293: entering _queue_task() for managed_node3/debug 22286 1726882810.10559: worker is 1 (out of 1 available) 22286 1726882810.10573: exiting _queue_task() for managed_node3/debug 22286 1726882810.10590: done queuing things up, now waiting for results queue to drain 22286 1726882810.10592: waiting for pending results... 22286 1726882810.10790: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 22286 1726882810.10902: in run() - task 0affe814-3a2d-a75d-4836-00000000006f 22286 1726882810.10917: variable 'ansible_search_path' from source: unknown 22286 1726882810.10920: variable 'ansible_search_path' from source: unknown 22286 1726882810.10957: calling self._execute() 22286 1726882810.11045: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.11049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.11059: variable 'omit' from source: magic vars 22286 1726882810.11383: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.11395: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.11402: variable 'omit' from source: magic vars 22286 1726882810.11449: variable 'omit' from source: magic vars 22286 1726882810.11531: variable 'network_provider' from source: set_fact 22286 1726882810.11548: variable 'omit' from source: magic vars 22286 1726882810.11588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882810.11621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882810.11641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 
1726882810.11658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882810.11669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882810.11705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882810.11708: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.11711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.11795: Set connection var ansible_shell_executable to /bin/sh 22286 1726882810.11803: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882810.11808: Set connection var ansible_connection to ssh 22286 1726882810.11811: Set connection var ansible_shell_type to sh 22286 1726882810.11818: Set connection var ansible_timeout to 10 22286 1726882810.11827: Set connection var ansible_pipelining to False 22286 1726882810.11849: variable 'ansible_shell_executable' from source: unknown 22286 1726882810.11853: variable 'ansible_connection' from source: unknown 22286 1726882810.11856: variable 'ansible_module_compression' from source: unknown 22286 1726882810.11858: variable 'ansible_shell_type' from source: unknown 22286 1726882810.11862: variable 'ansible_shell_executable' from source: unknown 22286 1726882810.11866: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.11872: variable 'ansible_pipelining' from source: unknown 22286 1726882810.11875: variable 'ansible_timeout' from source: unknown 22286 1726882810.11880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.11998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882810.12009: variable 'omit' from source: magic vars 22286 1726882810.12015: starting attempt loop 22286 1726882810.12019: running the handler 22286 1726882810.12062: handler run complete 22286 1726882810.12078: attempt loop complete, returning result 22286 1726882810.12082: _execute() done 22286 1726882810.12085: dumping result to json 22286 1726882810.12088: done dumping result, returning 22286 1726882810.12096: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-a75d-4836-00000000006f] 22286 1726882810.12102: sending task result for task 0affe814-3a2d-a75d-4836-00000000006f 22286 1726882810.12191: done sending task result for task 0affe814-3a2d-a75d-4836-00000000006f 22286 1726882810.12195: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 22286 1726882810.12268: no more pending results, returning what we have 22286 1726882810.12273: results queue empty 22286 1726882810.12273: checking for any_errors_fatal 22286 1726882810.12285: done checking for any_errors_fatal 22286 1726882810.12286: checking for max_fail_percentage 22286 1726882810.12288: done checking for max_fail_percentage 22286 1726882810.12289: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.12290: done checking to see if all hosts have failed 22286 1726882810.12291: getting the remaining hosts for this loop 22286 1726882810.12293: done getting the remaining hosts for this loop 22286 1726882810.12298: getting the next task for host managed_node3 22286 1726882810.12309: done getting next task for host managed_node3 22286 1726882810.12314: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 22286 1726882810.12317: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882810.12330: getting variables 22286 1726882810.12331: in VariableManager get_vars() 22286 1726882810.12371: Calling all_inventory to load vars for managed_node3 22286 1726882810.12374: Calling groups_inventory to load vars for managed_node3 22286 1726882810.12379: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.12388: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.12391: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.12395: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.13610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.15279: done with get_vars() 22286 1726882810.15302: done getting variables 22286 1726882810.15354: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:10 -0400 (0:00:00.050) 0:00:33.547 ****** 22286 1726882810.15384: entering _queue_task() for managed_node3/fail 22286 1726882810.15640: worker is 1 (out of 1 available) 22286 1726882810.15655: exiting _queue_task() for managed_node3/fail 22286 1726882810.15667: done queuing things up, now waiting for results queue to drain 22286 1726882810.15669: waiting for pending results... 22286 1726882810.15858: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22286 1726882810.15971: in run() - task 0affe814-3a2d-a75d-4836-000000000070 22286 1726882810.15984: variable 'ansible_search_path' from source: unknown 22286 1726882810.15988: variable 'ansible_search_path' from source: unknown 22286 1726882810.16022: calling self._execute() 22286 1726882810.16104: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.16112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.16122: variable 'omit' from source: magic vars 22286 1726882810.16450: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.16461: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.16568: variable 'network_state' from source: role '' defaults 22286 1726882810.16576: Evaluated conditional (network_state != {}): False 22286 1726882810.16582: when evaluation is False, skipping this task 22286 1726882810.16586: _execute() done 22286 1726882810.16589: dumping result to json 22286 1726882810.16595: done dumping result, returning 22286 1726882810.16603: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affe814-3a2d-a75d-4836-000000000070] 22286 1726882810.16609: sending task result for task 0affe814-3a2d-a75d-4836-000000000070 22286 1726882810.16703: done sending task result for task 0affe814-3a2d-a75d-4836-000000000070 22286 1726882810.16706: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882810.16762: no more pending results, returning what we have 22286 1726882810.16766: results queue empty 22286 1726882810.16767: checking for any_errors_fatal 22286 1726882810.16774: done checking for any_errors_fatal 22286 1726882810.16775: checking for max_fail_percentage 22286 1726882810.16777: done checking for max_fail_percentage 22286 1726882810.16779: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.16780: done checking to see if all hosts have failed 22286 1726882810.16781: getting the remaining hosts for this loop 22286 1726882810.16782: done getting the remaining hosts for this loop 22286 1726882810.16787: getting the next task for host managed_node3 22286 1726882810.16794: done getting next task for host managed_node3 22286 1726882810.16798: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22286 1726882810.16802: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 22286 1726882810.16820: getting variables 22286 1726882810.16822: in VariableManager get_vars() 22286 1726882810.16862: Calling all_inventory to load vars for managed_node3 22286 1726882810.16865: Calling groups_inventory to load vars for managed_node3 22286 1726882810.16867: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.16877: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.16880: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.16884: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.18053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.19599: done with get_vars() 22286 1726882810.19621: done getting variables 22286 1726882810.19669: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:10 -0400 (0:00:00.043) 0:00:33.590 ****** 22286 1726882810.19697: entering _queue_task() for managed_node3/fail 22286 1726882810.19929: worker is 1 (out of 1 available) 22286 1726882810.19945: exiting _queue_task() for managed_node3/fail 22286 1726882810.19960: done queuing things up, now waiting for results queue to drain 22286 1726882810.19962: waiting for pending results... 
22286 1726882810.20146: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22286 1726882810.20243: in run() - task 0affe814-3a2d-a75d-4836-000000000071 22286 1726882810.20256: variable 'ansible_search_path' from source: unknown 22286 1726882810.20260: variable 'ansible_search_path' from source: unknown 22286 1726882810.20295: calling self._execute() 22286 1726882810.20375: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.20384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.20394: variable 'omit' from source: magic vars 22286 1726882810.20710: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.20720: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.20825: variable 'network_state' from source: role '' defaults 22286 1726882810.20839: Evaluated conditional (network_state != {}): False 22286 1726882810.20843: when evaluation is False, skipping this task 22286 1726882810.20848: _execute() done 22286 1726882810.20851: dumping result to json 22286 1726882810.20853: done dumping result, returning 22286 1726882810.20864: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-a75d-4836-000000000071] 22286 1726882810.20870: sending task result for task 0affe814-3a2d-a75d-4836-000000000071 22286 1726882810.20966: done sending task result for task 0affe814-3a2d-a75d-4836-000000000071 22286 1726882810.20969: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882810.21023: no more pending results, returning what we have 22286 
1726882810.21026: results queue empty 22286 1726882810.21027: checking for any_errors_fatal 22286 1726882810.21036: done checking for any_errors_fatal 22286 1726882810.21037: checking for max_fail_percentage 22286 1726882810.21040: done checking for max_fail_percentage 22286 1726882810.21041: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.21042: done checking to see if all hosts have failed 22286 1726882810.21043: getting the remaining hosts for this loop 22286 1726882810.21044: done getting the remaining hosts for this loop 22286 1726882810.21048: getting the next task for host managed_node3 22286 1726882810.21054: done getting next task for host managed_node3 22286 1726882810.21058: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22286 1726882810.21061: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.21082: getting variables 22286 1726882810.21084: in VariableManager get_vars() 22286 1726882810.21120: Calling all_inventory to load vars for managed_node3 22286 1726882810.21124: Calling groups_inventory to load vars for managed_node3 22286 1726882810.21126: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.21140: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.21144: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.21147: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.22445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.24003: done with get_vars() 22286 1726882810.24024: done getting variables 22286 1726882810.24072: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:10 -0400 (0:00:00.044) 0:00:33.634 ****** 22286 1726882810.24100: entering _queue_task() for managed_node3/fail 22286 1726882810.24309: worker is 1 (out of 1 available) 22286 1726882810.24320: exiting _queue_task() for managed_node3/fail 22286 1726882810.24333: done queuing things up, now waiting for results queue to drain 22286 1726882810.24336: waiting for pending results... 
22286 1726882810.24523: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22286 1726882810.24642: in run() - task 0affe814-3a2d-a75d-4836-000000000072 22286 1726882810.24656: variable 'ansible_search_path' from source: unknown 22286 1726882810.24659: variable 'ansible_search_path' from source: unknown 22286 1726882810.24693: calling self._execute() 22286 1726882810.24772: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.24780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.24792: variable 'omit' from source: magic vars 22286 1726882810.25093: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.25103: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.25278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882810.26993: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882810.27048: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882810.27080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882810.27111: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882810.27136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882810.27204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.27239: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.27262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.27297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.27312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.27390: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.27402: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22286 1726882810.27500: variable 'ansible_distribution' from source: facts 22286 1726882810.27504: variable '__network_rh_distros' from source: role '' defaults 22286 1726882810.27513: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22286 1726882810.27516: when evaluation is False, skipping this task 22286 1726882810.27519: _execute() done 22286 1726882810.27525: dumping result to json 22286 1726882810.27528: done dumping result, returning 22286 1726882810.27537: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-a75d-4836-000000000072] 22286 1726882810.27544: sending task result for task 0affe814-3a2d-a75d-4836-000000000072 22286 1726882810.27629: done sending task result for task 0affe814-3a2d-a75d-4836-000000000072 22286 1726882810.27632: WORKER PROCESS EXITING 
skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22286 1726882810.27698: no more pending results, returning what we have 22286 1726882810.27701: results queue empty 22286 1726882810.27702: checking for any_errors_fatal 22286 1726882810.27707: done checking for any_errors_fatal 22286 1726882810.27708: checking for max_fail_percentage 22286 1726882810.27710: done checking for max_fail_percentage 22286 1726882810.27711: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.27713: done checking to see if all hosts have failed 22286 1726882810.27713: getting the remaining hosts for this loop 22286 1726882810.27716: done getting the remaining hosts for this loop 22286 1726882810.27719: getting the next task for host managed_node3 22286 1726882810.27725: done getting next task for host managed_node3 22286 1726882810.27730: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22286 1726882810.27733: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.27754: getting variables 22286 1726882810.27755: in VariableManager get_vars() 22286 1726882810.27796: Calling all_inventory to load vars for managed_node3 22286 1726882810.27799: Calling groups_inventory to load vars for managed_node3 22286 1726882810.27801: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.27810: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.27814: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.27817: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.28997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.30980: done with get_vars() 22286 1726882810.31013: done getting variables 22286 1726882810.31089: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:40:10 -0400 (0:00:00.070) 0:00:33.704 ****** 22286 1726882810.31126: entering _queue_task() for managed_node3/dnf 22286 1726882810.31440: worker is 1 (out of 1 available) 22286 1726882810.31455: exiting _queue_task() for managed_node3/dnf 22286 1726882810.31469: done queuing things up, now waiting for results queue to drain 22286 1726882810.31470: waiting for pending results... 
22286 1726882810.31660: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22286 1726882810.31780: in run() - task 0affe814-3a2d-a75d-4836-000000000073 22286 1726882810.31788: variable 'ansible_search_path' from source: unknown 22286 1726882810.31792: variable 'ansible_search_path' from source: unknown 22286 1726882810.31825: calling self._execute() 22286 1726882810.31916: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.31923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.31937: variable 'omit' from source: magic vars 22286 1726882810.32256: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.32266: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.32439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882810.34539: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882810.34543: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882810.34545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882810.34547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882810.34559: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882810.34667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.34695: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.34716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.34751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.34763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.34861: variable 'ansible_distribution' from source: facts 22286 1726882810.34865: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.34872: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22286 1726882810.34968: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882810.35083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.35103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.35128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.35161: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.35173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.35210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.35232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.35254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.35288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.35300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.35337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.35357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 
1726882810.35377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.35409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.35420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.35555: variable 'network_connections' from source: task vars 22286 1726882810.35563: variable 'interface' from source: play vars 22286 1726882810.35616: variable 'interface' from source: play vars 22286 1726882810.35679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882810.35813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882810.35849: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882810.35880: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882810.35903: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882810.35940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882810.35959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882810.35989: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.36010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882810.36053: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882810.36263: variable 'network_connections' from source: task vars 22286 1726882810.36269: variable 'interface' from source: play vars 22286 1726882810.36323: variable 'interface' from source: play vars 22286 1726882810.36345: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22286 1726882810.36349: when evaluation is False, skipping this task 22286 1726882810.36351: _execute() done 22286 1726882810.36356: dumping result to json 22286 1726882810.36361: done dumping result, returning 22286 1726882810.36368: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-000000000073] 22286 1726882810.36374: sending task result for task 0affe814-3a2d-a75d-4836-000000000073 22286 1726882810.36468: done sending task result for task 0affe814-3a2d-a75d-4836-000000000073 22286 1726882810.36471: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22286 1726882810.36529: no more pending results, returning what we have 22286 1726882810.36533: results queue empty 22286 1726882810.36536: checking for any_errors_fatal 22286 1726882810.36545: done checking for any_errors_fatal 22286 1726882810.36545: 
checking for max_fail_percentage 22286 1726882810.36548: done checking for max_fail_percentage 22286 1726882810.36549: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.36550: done checking to see if all hosts have failed 22286 1726882810.36551: getting the remaining hosts for this loop 22286 1726882810.36553: done getting the remaining hosts for this loop 22286 1726882810.36557: getting the next task for host managed_node3 22286 1726882810.36564: done getting next task for host managed_node3 22286 1726882810.36568: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22286 1726882810.36572: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.36593: getting variables 22286 1726882810.36595: in VariableManager get_vars() 22286 1726882810.36639: Calling all_inventory to load vars for managed_node3 22286 1726882810.36642: Calling groups_inventory to load vars for managed_node3 22286 1726882810.36645: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.36655: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.36658: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.36662: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.37867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.39425: done with get_vars() 22286 1726882810.39450: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22286 1726882810.39510: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:40:10 -0400 (0:00:00.084) 0:00:33.788 ****** 22286 1726882810.39540: entering _queue_task() for managed_node3/yum 22286 1726882810.39777: worker is 1 (out of 1 available) 22286 1726882810.39791: exiting _queue_task() for managed_node3/yum 22286 1726882810.39804: done queuing things up, now waiting for results queue to drain 22286 1726882810.39806: waiting for pending results... 
22286 1726882810.39998: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22286 1726882810.40109: in run() - task 0affe814-3a2d-a75d-4836-000000000074 22286 1726882810.40121: variable 'ansible_search_path' from source: unknown 22286 1726882810.40125: variable 'ansible_search_path' from source: unknown 22286 1726882810.40161: calling self._execute() 22286 1726882810.40241: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.40247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.40258: variable 'omit' from source: magic vars 22286 1726882810.40568: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.40589: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.40940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882810.43373: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882810.43425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882810.43460: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882810.43493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882810.43515: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882810.43588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.43612: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.43633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.43668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.43686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.43760: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.43776: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22286 1726882810.43780: when evaluation is False, skipping this task 22286 1726882810.43783: _execute() done 22286 1726882810.43786: dumping result to json 22286 1726882810.43790: done dumping result, returning 22286 1726882810.43800: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-000000000074] 22286 1726882810.43805: sending task result for task 0affe814-3a2d-a75d-4836-000000000074 22286 1726882810.43897: done sending task result for task 0affe814-3a2d-a75d-4836-000000000074 22286 1726882810.43901: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22286 1726882810.43961: no more pending results, returning 
what we have 22286 1726882810.43964: results queue empty 22286 1726882810.43965: checking for any_errors_fatal 22286 1726882810.43971: done checking for any_errors_fatal 22286 1726882810.43972: checking for max_fail_percentage 22286 1726882810.43974: done checking for max_fail_percentage 22286 1726882810.43976: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.43977: done checking to see if all hosts have failed 22286 1726882810.43978: getting the remaining hosts for this loop 22286 1726882810.43980: done getting the remaining hosts for this loop 22286 1726882810.43984: getting the next task for host managed_node3 22286 1726882810.43992: done getting next task for host managed_node3 22286 1726882810.43996: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22286 1726882810.44000: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.44020: getting variables 22286 1726882810.44022: in VariableManager get_vars() 22286 1726882810.44069: Calling all_inventory to load vars for managed_node3 22286 1726882810.44072: Calling groups_inventory to load vars for managed_node3 22286 1726882810.44075: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.44085: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.44089: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.44092: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.45419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.46969: done with get_vars() 22286 1726882810.46992: done getting variables 22286 1726882810.47048: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:40:10 -0400 (0:00:00.075) 0:00:33.864 ****** 22286 1726882810.47075: entering _queue_task() for managed_node3/fail 22286 1726882810.47309: worker is 1 (out of 1 available) 22286 1726882810.47322: exiting _queue_task() for managed_node3/fail 22286 1726882810.47337: done queuing things up, now waiting for results queue to drain 22286 1726882810.47339: waiting for pending results... 
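The skip recorded above ("false_condition": "ansible_distribution_major_version | int < 8") is an ordinary Jinja2 expression evaluated against gathered facts. A minimal sketch of that evaluation using plain Jinja2, with a hypothetical major version of "9" standing in for the managed node's actual fact value:

```python
from jinja2 import Environment

env = Environment()
# Same guard expression as in the log; the fact value "9" is a hypothetical
# stand-in (the log only tells us the version is != '6' and not < 8).
guard = env.compile_expression("ansible_distribution_major_version | int < 8")
print(guard(ansible_distribution_major_version="9"))  # False -> task is skipped
```

When the expression evaluates to False, the task executor short-circuits before the module runs, which is why the result above reports only `skip_reason` and `changed: false`.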
22286 1726882810.47526: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22286 1726882810.47633: in run() - task 0affe814-3a2d-a75d-4836-000000000075 22286 1726882810.47647: variable 'ansible_search_path' from source: unknown 22286 1726882810.47651: variable 'ansible_search_path' from source: unknown 22286 1726882810.47688: calling self._execute() 22286 1726882810.47767: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.47772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.47789: variable 'omit' from source: magic vars 22286 1726882810.48091: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.48102: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.48202: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882810.48371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882810.50079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882810.50136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882810.50166: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882810.50201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882810.50224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882810.50296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 22286 1726882810.50330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.50354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.50389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.50402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.50448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.50468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.50490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.50526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.50542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.50578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.50597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.50618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.50654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.50666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.50807: variable 'network_connections' from source: task vars 22286 1726882810.50818: variable 'interface' from source: play vars 22286 1726882810.50877: variable 'interface' from source: play vars 22286 1726882810.50936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882810.51065: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882810.51097: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882810.51124: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882810.51151: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882810.51189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882810.51208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882810.51229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.51253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882810.51297: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882810.51496: variable 'network_connections' from source: task vars 22286 1726882810.51500: variable 'interface' from source: play vars 22286 1726882810.51553: variable 'interface' from source: play vars 22286 1726882810.51573: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22286 1726882810.51579: when evaluation is False, skipping this task 22286 1726882810.51582: _execute() done 22286 1726882810.51585: dumping result to json 22286 1726882810.51587: done dumping result, returning 22286 1726882810.51595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-000000000075] 22286 1726882810.51604: sending task result for task 0affe814-3a2d-a75d-4836-000000000075 22286 1726882810.51699: done sending task result for task 
0affe814-3a2d-a75d-4836-000000000075 22286 1726882810.51702: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22286 1726882810.51781: no more pending results, returning what we have 22286 1726882810.51784: results queue empty 22286 1726882810.51785: checking for any_errors_fatal 22286 1726882810.51792: done checking for any_errors_fatal 22286 1726882810.51793: checking for max_fail_percentage 22286 1726882810.51795: done checking for max_fail_percentage 22286 1726882810.51796: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.51797: done checking to see if all hosts have failed 22286 1726882810.51798: getting the remaining hosts for this loop 22286 1726882810.51800: done getting the remaining hosts for this loop 22286 1726882810.51804: getting the next task for host managed_node3 22286 1726882810.51810: done getting next task for host managed_node3 22286 1726882810.51813: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22286 1726882810.51818: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.51840: getting variables 22286 1726882810.51842: in VariableManager get_vars() 22286 1726882810.51887: Calling all_inventory to load vars for managed_node3 22286 1726882810.51891: Calling groups_inventory to load vars for managed_node3 22286 1726882810.51894: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.51903: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.51907: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.51910: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.53115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.54749: done with get_vars() 22286 1726882810.54770: done getting variables 22286 1726882810.54817: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:40:10 -0400 (0:00:00.077) 0:00:33.941 ****** 22286 1726882810.54844: entering _queue_task() for managed_node3/package 22286 1726882810.55068: worker is 1 (out of 1 available) 22286 1726882810.55084: exiting _queue_task() for managed_node3/package 22286 1726882810.55098: done queuing things up, now waiting for results queue to drain 22286 1726882810.55099: waiting for pending results... 
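The consent task above is guarded by an or-expression over two role defaults. A minimal sketch of how that guard resolves, with hypothetical False values standing in for the flags the role derives from `network_connections`:

```python
from jinja2 import Environment

env = Environment()
# Guard from the log; both flags are hypothetical stand-ins here, since the
# role computes them from the configured connections (no wireless/team
# interfaces in this run, so both are False).
guard = env.compile_expression(
    "__network_wireless_connections_defined or __network_team_connections_defined"
)
print(guard(__network_wireless_connections_defined=False,
            __network_team_connections_defined=False))  # False -> fail task skipped
```

Because the guard is False, the `fail` action module is never invoked, so no user-consent error is raised.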
22286 1726882810.55282: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 22286 1726882810.55388: in run() - task 0affe814-3a2d-a75d-4836-000000000076 22286 1726882810.55400: variable 'ansible_search_path' from source: unknown 22286 1726882810.55404: variable 'ansible_search_path' from source: unknown 22286 1726882810.55444: calling self._execute() 22286 1726882810.55520: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.55527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.55541: variable 'omit' from source: magic vars 22286 1726882810.55838: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.55854: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.56017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882810.56245: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882810.56305: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882810.56310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882810.56439: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882810.56506: variable 'network_packages' from source: role '' defaults 22286 1726882810.56663: variable '__network_provider_setup' from source: role '' defaults 22286 1726882810.56682: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882810.56765: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882810.56779: variable '__network_packages_default_nm' from source: role '' defaults 22286 1726882810.56860: variable 
'__network_packages_default_nm' from source: role '' defaults 22286 1726882810.57148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882810.61540: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882810.61545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882810.61941: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882810.61956: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882810.61960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882810.62109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.62246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.62366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.62550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.62574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 
1726882810.62850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.62863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.63239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.63243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.63246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.63839: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22286 1726882810.64227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.64267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.64321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.64507: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.64681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.64797: variable 'ansible_python' from source: facts 22286 1726882810.64870: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22286 1726882810.65318: variable '__network_wpa_supplicant_required' from source: role '' defaults 22286 1726882810.65440: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22286 1726882810.65969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.65973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.65976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.66056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.66101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.66256: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882810.66300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882810.66525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.66529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882810.66840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882810.66936: variable 'network_connections' from source: task vars 22286 1726882810.67024: variable 'interface' from source: play vars 22286 1726882810.67347: variable 'interface' from source: play vars 22286 1726882810.67495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882810.67668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882810.67712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882810.67982: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882810.68059: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882810.68896: variable 'network_connections' from source: task vars 22286 1726882810.68968: variable 'interface' from source: play vars 22286 1726882810.69247: variable 'interface' from source: play vars 22286 1726882810.69502: variable '__network_packages_default_wireless' from source: role '' defaults 22286 1726882810.69530: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882810.70515: variable 'network_connections' from source: task vars 22286 1726882810.70613: variable 'interface' from source: play vars 22286 1726882810.70710: variable 'interface' from source: play vars 22286 1726882810.70827: variable '__network_packages_default_team' from source: role '' defaults 22286 1726882810.71039: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882810.71900: variable 'network_connections' from source: task vars 22286 1726882810.72152: variable 'interface' from source: play vars 22286 1726882810.72156: variable 'interface' from source: play vars 22286 1726882810.72296: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882810.72491: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882810.72505: variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882810.72851: variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882810.73386: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22286 1726882810.74760: variable 'network_connections' from source: task vars 22286 1726882810.74772: variable 'interface' from source: play vars 
22286 1726882810.74921: variable 'interface' from source: play vars 22286 1726882810.75006: variable 'ansible_distribution' from source: facts 22286 1726882810.75018: variable '__network_rh_distros' from source: role '' defaults 22286 1726882810.75031: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.75080: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22286 1726882810.75541: variable 'ansible_distribution' from source: facts 22286 1726882810.75615: variable '__network_rh_distros' from source: role '' defaults 22286 1726882810.75627: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.75642: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22286 1726882810.76088: variable 'ansible_distribution' from source: facts 22286 1726882810.76099: variable '__network_rh_distros' from source: role '' defaults 22286 1726882810.76161: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.76208: variable 'network_provider' from source: set_fact 22286 1726882810.76439: variable 'ansible_facts' from source: unknown 22286 1726882810.78372: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22286 1726882810.78382: when evaluation is False, skipping this task 22286 1726882810.78391: _execute() done 22286 1726882810.78399: dumping result to json 22286 1726882810.78408: done dumping result, returning 22286 1726882810.78485: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-a75d-4836-000000000076] 22286 1726882810.78497: sending task result for task 0affe814-3a2d-a75d-4836-000000000076 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22286 1726882810.78809: no more 
pending results, returning what we have 22286 1726882810.78817: results queue empty 22286 1726882810.78819: checking for any_errors_fatal 22286 1726882810.78833: done checking for any_errors_fatal 22286 1726882810.78836: checking for max_fail_percentage 22286 1726882810.78842: done checking for max_fail_percentage 22286 1726882810.78843: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.78845: done checking to see if all hosts have failed 22286 1726882810.78845: getting the remaining hosts for this loop 22286 1726882810.78847: done getting the remaining hosts for this loop 22286 1726882810.78852: getting the next task for host managed_node3 22286 1726882810.78861: done getting next task for host managed_node3 22286 1726882810.78867: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22286 1726882810.78871: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.78893: getting variables 22286 1726882810.78895: in VariableManager get_vars() 22286 1726882810.78971: Calling all_inventory to load vars for managed_node3 22286 1726882810.78976: Calling groups_inventory to load vars for managed_node3 22286 1726882810.78979: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.78991: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.78995: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.79008: Calling groups_plugins_play to load vars for managed_node3 22286 1726882810.80220: done sending task result for task 0affe814-3a2d-a75d-4836-000000000076 22286 1726882810.80225: WORKER PROCESS EXITING 22286 1726882810.82354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882810.87433: done with get_vars() 22286 1726882810.87701: done getting variables 22286 1726882810.87780: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:10 -0400 (0:00:00.330) 0:00:34.272 ****** 22286 1726882810.87867: entering _queue_task() for managed_node3/package 22286 1726882810.88827: worker is 1 (out of 1 available) 22286 1726882810.88845: exiting _queue_task() for managed_node3/package 22286 1726882810.88858: done queuing things up, now waiting for results queue to drain 22286 1726882810.88859: waiting for pending results... 
22286 1726882810.89330: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22286 1726882810.89741: in run() - task 0affe814-3a2d-a75d-4836-000000000077 22286 1726882810.89765: variable 'ansible_search_path' from source: unknown 22286 1726882810.89849: variable 'ansible_search_path' from source: unknown 22286 1726882810.89914: calling self._execute() 22286 1726882810.90203: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882810.90222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882810.90245: variable 'omit' from source: magic vars 22286 1726882810.91339: variable 'ansible_distribution_major_version' from source: facts 22286 1726882810.91344: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882810.91699: variable 'network_state' from source: role '' defaults 22286 1726882810.91702: Evaluated conditional (network_state != {}): False 22286 1726882810.91708: when evaluation is False, skipping this task 22286 1726882810.91716: _execute() done 22286 1726882810.91724: dumping result to json 22286 1726882810.91731: done dumping result, returning 22286 1726882810.91746: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-a75d-4836-000000000077] 22286 1726882810.91917: sending task result for task 0affe814-3a2d-a75d-4836-000000000077 22286 1726882810.91997: done sending task result for task 0affe814-3a2d-a75d-4836-000000000077 22286 1726882810.92000: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882810.92100: no more pending results, returning what we have 22286 1726882810.92105: results queue empty 22286 1726882810.92106: checking 
for any_errors_fatal 22286 1726882810.92114: done checking for any_errors_fatal 22286 1726882810.92115: checking for max_fail_percentage 22286 1726882810.92117: done checking for max_fail_percentage 22286 1726882810.92119: checking to see if all hosts have failed and the running result is not ok 22286 1726882810.92120: done checking to see if all hosts have failed 22286 1726882810.92121: getting the remaining hosts for this loop 22286 1726882810.92123: done getting the remaining hosts for this loop 22286 1726882810.92132: getting the next task for host managed_node3 22286 1726882810.92141: done getting next task for host managed_node3 22286 1726882810.92146: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22286 1726882810.92151: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882810.92178: getting variables 22286 1726882810.92180: in VariableManager get_vars() 22286 1726882810.92230: Calling all_inventory to load vars for managed_node3 22286 1726882810.92535: Calling groups_inventory to load vars for managed_node3 22286 1726882810.92540: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882810.92554: Calling all_plugins_play to load vars for managed_node3 22286 1726882810.92558: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882810.92563: Calling groups_plugins_play to load vars for managed_node3 22286 1726882811.09349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882811.15332: done with get_vars() 22286 1726882811.15387: done getting variables 22286 1726882811.15659: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:40:11 -0400 (0:00:00.278) 0:00:34.550 ****** 22286 1726882811.15699: entering _queue_task() for managed_node3/package 22286 1726882811.16484: worker is 1 (out of 1 available) 22286 1726882811.16496: exiting _queue_task() for managed_node3/package 22286 1726882811.16509: done queuing things up, now waiting for results queue to drain 22286 1726882811.16511: waiting for pending results... 
22286 1726882811.17059: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22286 1726882811.17224: in run() - task 0affe814-3a2d-a75d-4836-000000000078 22286 1726882811.17282: variable 'ansible_search_path' from source: unknown 22286 1726882811.17372: variable 'ansible_search_path' from source: unknown 22286 1726882811.17401: calling self._execute() 22286 1726882811.17810: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882811.17814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882811.17817: variable 'omit' from source: magic vars 22286 1726882811.18786: variable 'ansible_distribution_major_version' from source: facts 22286 1726882811.18894: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882811.18972: variable 'network_state' from source: role '' defaults 22286 1726882811.19126: Evaluated conditional (network_state != {}): False 22286 1726882811.19136: when evaluation is False, skipping this task 22286 1726882811.19145: _execute() done 22286 1726882811.19153: dumping result to json 22286 1726882811.19161: done dumping result, returning 22286 1726882811.19172: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-a75d-4836-000000000078] 22286 1726882811.19184: sending task result for task 0affe814-3a2d-a75d-4836-000000000078 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882811.19361: no more pending results, returning what we have 22286 1726882811.19366: results queue empty 22286 1726882811.19367: checking for any_errors_fatal 22286 1726882811.19374: done checking for any_errors_fatal 22286 1726882811.19378: checking for max_fail_percentage 22286 
1726882811.19380: done checking for max_fail_percentage 22286 1726882811.19381: checking to see if all hosts have failed and the running result is not ok 22286 1726882811.19382: done checking to see if all hosts have failed 22286 1726882811.19383: getting the remaining hosts for this loop 22286 1726882811.19386: done getting the remaining hosts for this loop 22286 1726882811.19390: getting the next task for host managed_node3 22286 1726882811.19398: done getting next task for host managed_node3 22286 1726882811.19402: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22286 1726882811.19406: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882811.19430: getting variables 22286 1726882811.19431: in VariableManager get_vars() 22286 1726882811.19672: Calling all_inventory to load vars for managed_node3 22286 1726882811.19677: Calling groups_inventory to load vars for managed_node3 22286 1726882811.19680: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882811.19692: Calling all_plugins_play to load vars for managed_node3 22286 1726882811.19696: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882811.19699: Calling groups_plugins_play to load vars for managed_node3 22286 1726882811.20351: done sending task result for task 0affe814-3a2d-a75d-4836-000000000078 22286 1726882811.20355: WORKER PROCESS EXITING 22286 1726882811.24060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882811.30487: done with get_vars() 22286 1726882811.30524: done getting variables 22286 1726882811.30602: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:40:11 -0400 (0:00:00.151) 0:00:34.701 ****** 22286 1726882811.30852: entering _queue_task() for managed_node3/service 22286 1726882811.31870: worker is 1 (out of 1 available) 22286 1726882811.31880: exiting _queue_task() for managed_node3/service 22286 1726882811.31892: done queuing things up, now waiting for results queue to drain 22286 1726882811.31894: waiting for pending results... 
22286 1726882811.32104: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22286 1726882811.32511: in run() - task 0affe814-3a2d-a75d-4836-000000000079 22286 1726882811.32562: variable 'ansible_search_path' from source: unknown 22286 1726882811.32577: variable 'ansible_search_path' from source: unknown 22286 1726882811.32623: calling self._execute() 22286 1726882811.32974: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882811.32990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882811.33231: variable 'omit' from source: magic vars 22286 1726882811.34404: variable 'ansible_distribution_major_version' from source: facts 22286 1726882811.34440: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882811.34845: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882811.35520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882811.42094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882811.42196: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882811.42639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882811.42691: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882811.42738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882811.42839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 22286 1726882811.43440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.43443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.43447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.43449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.43452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882811.44039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.44042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.44046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.44048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.44051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882811.44251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.44291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.44346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.44369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.44817: variable 'network_connections' from source: task vars 22286 1726882811.45240: variable 'interface' from source: play vars 22286 1726882811.45243: variable 'interface' from source: play vars 22286 1726882811.45246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882811.45840: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882811.45904: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882811.45949: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882811.45990: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882811.46439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882811.46442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882811.46445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.46448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882811.46450: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882811.47246: variable 'network_connections' from source: task vars 22286 1726882811.47258: variable 'interface' from source: play vars 22286 1726882811.47339: variable 'interface' from source: play vars 22286 1726882811.47380: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22286 1726882811.47390: when evaluation is False, skipping this task 22286 1726882811.47842: _execute() done 22286 1726882811.47845: dumping result to json 22286 1726882811.47848: done dumping result, returning 22286 1726882811.47850: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-a75d-4836-000000000079] 22286 1726882811.47852: sending task result for task 0affe814-3a2d-a75d-4836-000000000079 22286 1726882811.47931: done sending task result for task 
0affe814-3a2d-a75d-4836-000000000079 22286 1726882811.47986: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22286 1726882811.48000: no more pending results, returning what we have 22286 1726882811.48003: results queue empty 22286 1726882811.48004: checking for any_errors_fatal 22286 1726882811.48011: done checking for any_errors_fatal 22286 1726882811.48012: checking for max_fail_percentage 22286 1726882811.48014: done checking for max_fail_percentage 22286 1726882811.48015: checking to see if all hosts have failed and the running result is not ok 22286 1726882811.48016: done checking to see if all hosts have failed 22286 1726882811.48017: getting the remaining hosts for this loop 22286 1726882811.48019: done getting the remaining hosts for this loop 22286 1726882811.48022: getting the next task for host managed_node3 22286 1726882811.48029: done getting next task for host managed_node3 22286 1726882811.48033: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22286 1726882811.48038: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882811.48056: getting variables 22286 1726882811.48058: in VariableManager get_vars() 22286 1726882811.48100: Calling all_inventory to load vars for managed_node3 22286 1726882811.48103: Calling groups_inventory to load vars for managed_node3 22286 1726882811.48106: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882811.48115: Calling all_plugins_play to load vars for managed_node3 22286 1726882811.48118: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882811.48122: Calling groups_plugins_play to load vars for managed_node3 22286 1726882811.53342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882811.59902: done with get_vars() 22286 1726882811.60100: done getting variables 22286 1726882811.60173: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:40:11 -0400 (0:00:00.294) 0:00:34.996 ****** 22286 1726882811.60333: entering _queue_task() for managed_node3/service 22286 1726882811.61365: worker is 1 (out of 1 available) 22286 1726882811.61379: exiting _queue_task() for managed_node3/service 22286 1726882811.61393: done queuing things up, now waiting for results queue to drain 22286 1726882811.61395: waiting for pending results... 
22286 1726882811.61875: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22286 1726882811.62250: in run() - task 0affe814-3a2d-a75d-4836-00000000007a 22286 1726882811.62274: variable 'ansible_search_path' from source: unknown 22286 1726882811.62286: variable 'ansible_search_path' from source: unknown 22286 1726882811.62335: calling self._execute() 22286 1726882811.62564: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882811.62651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882811.62673: variable 'omit' from source: magic vars 22286 1726882811.63804: variable 'ansible_distribution_major_version' from source: facts 22286 1726882811.63823: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882811.64151: variable 'network_provider' from source: set_fact 22286 1726882811.64540: variable 'network_state' from source: role '' defaults 22286 1726882811.64544: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22286 1726882811.64547: variable 'omit' from source: magic vars 22286 1726882811.64578: variable 'omit' from source: magic vars 22286 1726882811.64620: variable 'network_service_name' from source: role '' defaults 22286 1726882811.64708: variable 'network_service_name' from source: role '' defaults 22286 1726882811.65122: variable '__network_provider_setup' from source: role '' defaults 22286 1726882811.65176: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882811.65259: variable '__network_service_name_default_nm' from source: role '' defaults 22286 1726882811.65388: variable '__network_packages_default_nm' from source: role '' defaults 22286 1726882811.65470: variable '__network_packages_default_nm' from source: role '' defaults 22286 1726882811.66339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 22286 1726882811.70930: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882811.72025: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882811.72341: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882811.72344: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882811.72371: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882811.72741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882811.72746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.72749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.72809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.72827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.72999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 22286 1726882811.73036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.73073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.73191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.73439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.73979: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22286 1726882811.74138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882811.74372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.74408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.74465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.74488: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.74807: variable 'ansible_python' from source: facts 22286 1726882811.74841: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22286 1726882811.75339: variable '__network_wpa_supplicant_required' from source: role '' defaults 22286 1726882811.75343: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22286 1726882811.75607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882811.75646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.75683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.76040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.76044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.76046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882811.76070: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882811.76107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.76165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882811.76260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882811.76841: variable 'network_connections' from source: task vars 22286 1726882811.76844: variable 'interface' from source: play vars 22286 1726882811.76847: variable 'interface' from source: play vars 22286 1726882811.77082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882811.77605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882811.77683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882811.77733: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882811.77781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882811.78013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882811.78057: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882811.78103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882811.78441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882811.78445: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882811.79014: variable 'network_connections' from source: task vars 22286 1726882811.79153: variable 'interface' from source: play vars 22286 1726882811.79345: variable 'interface' from source: play vars 22286 1726882811.79390: variable '__network_packages_default_wireless' from source: role '' defaults 22286 1726882811.79696: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882811.80490: variable 'network_connections' from source: task vars 22286 1726882811.80502: variable 'interface' from source: play vars 22286 1726882811.80590: variable 'interface' from source: play vars 22286 1726882811.80627: variable '__network_packages_default_team' from source: role '' defaults 22286 1726882811.80852: variable '__network_team_connections_defined' from source: role '' defaults 22286 1726882811.81316: variable 'network_connections' from source: task vars 22286 1726882811.81330: variable 'interface' from source: play vars 22286 1726882811.81439: variable 'interface' from source: play vars 22286 1726882811.81536: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882811.81625: variable '__network_service_name_default_initscripts' from source: role '' defaults 22286 1726882811.81643: 
variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882811.81730: variable '__network_packages_default_initscripts' from source: role '' defaults 22286 1726882811.82066: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22286 1726882811.82784: variable 'network_connections' from source: task vars 22286 1726882811.82796: variable 'interface' from source: play vars 22286 1726882811.82886: variable 'interface' from source: play vars 22286 1726882811.82902: variable 'ansible_distribution' from source: facts 22286 1726882811.82911: variable '__network_rh_distros' from source: role '' defaults 22286 1726882811.82922: variable 'ansible_distribution_major_version' from source: facts 22286 1726882811.82945: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22286 1726882811.83228: variable 'ansible_distribution' from source: facts 22286 1726882811.83340: variable '__network_rh_distros' from source: role '' defaults 22286 1726882811.83359: variable 'ansible_distribution_major_version' from source: facts 22286 1726882811.83389: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22286 1726882811.83785: variable 'ansible_distribution' from source: facts 22286 1726882811.83796: variable '__network_rh_distros' from source: role '' defaults 22286 1726882811.83808: variable 'ansible_distribution_major_version' from source: facts 22286 1726882811.83859: variable 'network_provider' from source: set_fact 22286 1726882811.83900: variable 'omit' from source: magic vars 22286 1726882811.83942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882811.83979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882811.84005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 
1726882811.84026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882811.84043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882811.84075: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882811.84081: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882811.84084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882811.84342: Set connection var ansible_shell_executable to /bin/sh 22286 1726882811.84346: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882811.84349: Set connection var ansible_connection to ssh 22286 1726882811.84351: Set connection var ansible_shell_type to sh 22286 1726882811.84353: Set connection var ansible_timeout to 10 22286 1726882811.84355: Set connection var ansible_pipelining to False 22286 1726882811.84358: variable 'ansible_shell_executable' from source: unknown 22286 1726882811.84360: variable 'ansible_connection' from source: unknown 22286 1726882811.84362: variable 'ansible_module_compression' from source: unknown 22286 1726882811.84364: variable 'ansible_shell_type' from source: unknown 22286 1726882811.84366: variable 'ansible_shell_executable' from source: unknown 22286 1726882811.84368: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882811.84370: variable 'ansible_pipelining' from source: unknown 22286 1726882811.84372: variable 'ansible_timeout' from source: unknown 22286 1726882811.84377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882811.84464: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882811.84479: variable 'omit' from source: magic vars 22286 1726882811.84487: starting attempt loop 22286 1726882811.84490: running the handler 22286 1726882811.84695: variable 'ansible_facts' from source: unknown 22286 1726882811.86213: _low_level_execute_command(): starting 22286 1726882811.86220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882811.87082: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882811.87152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882811.87212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882811.87216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882811.87242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882811.87417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 
1726882811.89483: stdout chunk (state=3): >>>/root <<< 22286 1726882811.89551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882811.89593: stderr chunk (state=3): >>><<< 22286 1726882811.89606: stdout chunk (state=3): >>><<< 22286 1726882811.89699: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882811.89719: _low_level_execute_command(): starting 22286 1726882811.89731: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386 `" && echo ansible-tmp-1726882811.8970666-23566-1479367228386="` echo /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386 `" ) && sleep 0' 22286 1726882811.91018: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 <<< 22286 1726882811.91033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882811.91084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882811.91194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882811.91302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882811.91409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882811.91461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882811.93691: stdout chunk (state=3): >>>ansible-tmp-1726882811.8970666-23566-1479367228386=/root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386 <<< 22286 1726882811.93783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882811.93919: stderr chunk (state=3): >>><<< 22286 1726882811.93949: stdout chunk (state=3): >>><<< 22286 1726882811.94239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882811.8970666-23566-1479367228386=/root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386 , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882811.94247: variable 'ansible_module_compression' from source: unknown 22286 1726882811.94249: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 22286 1726882811.94310: variable 'ansible_facts' from source: unknown 22286 1726882811.94757: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py 22286 1726882811.95283: Sending initial data 22286 1726882811.95287: Sent initial data (154 bytes) 22286 1726882811.96581: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882811.96709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882811.96813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882811.96950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882811.97097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882811.98909: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882811.99074: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882811.99184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpsjdugma_ /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py <<< 22286 1726882811.99187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py" <<< 22286 1726882811.99372: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpsjdugma_" to remote "/root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py" <<< 22286 1726882812.04414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882812.04418: stdout chunk (state=3): >>><<< 22286 1726882812.04421: stderr chunk (state=3): >>><<< 22286 1726882812.04424: done transferring module to remote 22286 1726882812.04426: _low_level_execute_command(): starting 22286 1726882812.04436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/ /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py && sleep 0' 22286 1726882812.05662: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882812.05740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882812.05805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882812.05829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882812.06355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882812.08180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882812.08184: stdout chunk (state=3): >>><<< 22286 1726882812.08187: stderr chunk (state=3): >>><<< 22286 1726882812.08189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882812.08192: _low_level_execute_command(): starting 22286 1726882812.08195: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/AnsiballZ_systemd.py && sleep 0' 22286 1726882812.09136: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882812.09451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882812.09488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882812.09606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882812.42467: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", 
"NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", 
"ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11960320", "MemoryAvailable": "infinity", "CPUUsageNSec": "1972220000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", 
"LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", 
"AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22286 1726882812.44463: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882812.44641: stderr chunk (state=3): >>><<< 22286 1726882812.44649: stdout chunk (state=3): >>><<< 22286 1726882812.44653: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "653", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ExecMainStartTimestampMonotonic": "18094121", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "653", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager 
Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3421", "MemoryCurrent": "11960320", "MemoryAvailable": "infinity", "CPUUsageNSec": "1972220000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", 
"PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"NetworkManager-wait-online.service network.service shutdown.target multi-user.target network.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service dbus.socket basic.target network-pre.target system.slice sysinit.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:41 EDT", "StateChangeTimestampMonotonic": "505811565", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:33 EDT", "InactiveExitTimestampMonotonic": "18094364", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:34 EDT", "ActiveEnterTimestampMonotonic": "18531095", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:33 EDT", "ConditionTimestampMonotonic": "18086405", "AssertTimestamp": "Fri 2024-09-20 21:24:33 EDT", "AssertTimestampMonotonic": "18086408", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1c8adba7025b47b4adeb74e368331c9f", 
"CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882812.45301: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882812.45410: _low_level_execute_command(): starting 22286 1726882812.45414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882811.8970666-23566-1479367228386/ > /dev/null 2>&1 && sleep 0' 22286 1726882812.46663: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882812.46673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882812.46685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882812.46844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882812.46853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882812.46940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882812.46955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882812.46974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882812.47112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882812.49232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882812.49239: stdout chunk (state=3): >>><<< 22286 1726882812.49245: stderr chunk (state=3): >>><<< 22286 1726882812.49306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882812.49309: handler run complete 22286 1726882812.49381: attempt loop complete, returning result 22286 1726882812.49384: _execute() done 22286 1726882812.49387: dumping result to json 22286 1726882812.49410: done dumping result, returning 22286 1726882812.49422: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-a75d-4836-00000000007a] 22286 1726882812.49427: sending task result for task 0affe814-3a2d-a75d-4836-00000000007a 22286 1726882812.50395: done sending task result for task 0affe814-3a2d-a75d-4836-00000000007a ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882812.50465: no more pending results, returning what we have 22286 1726882812.50468: results queue empty 22286 1726882812.50469: checking for any_errors_fatal 22286 1726882812.50476: done checking for any_errors_fatal 22286 1726882812.50477: checking for max_fail_percentage 22286 1726882812.50479: done checking for max_fail_percentage 22286 1726882812.50480: checking to see if all hosts have failed and the running result is not ok 22286 1726882812.50481: done checking to see if all hosts have failed 22286 1726882812.50482: getting the remaining hosts for this loop 22286 1726882812.50485: done getting the remaining hosts for this loop 22286 1726882812.50490: getting the next task for host managed_node3 22286 1726882812.50497: done getting next task for host managed_node3 22286 1726882812.50501: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22286 1726882812.50506: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882812.50519: getting variables 22286 1726882812.50521: in VariableManager get_vars() 22286 1726882812.50566: Calling all_inventory to load vars for managed_node3 22286 1726882812.50569: Calling groups_inventory to load vars for managed_node3 22286 1726882812.50572: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882812.50584: Calling all_plugins_play to load vars for managed_node3 22286 1726882812.50588: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882812.50592: Calling groups_plugins_play to load vars for managed_node3 22286 1726882812.51350: WORKER PROCESS EXITING 22286 1726882812.53396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882812.57517: done with get_vars() 22286 1726882812.57596: done getting variables 22286 1726882812.57674: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:12 -0400 (0:00:00.975) 0:00:35.971 ****** 22286 1726882812.57836: entering _queue_task() for managed_node3/service 22286 1726882812.58640: worker is 1 (out of 1 available) 22286 1726882812.58852: 
exiting _queue_task() for managed_node3/service 22286 1726882812.58865: done queuing things up, now waiting for results queue to drain 22286 1726882812.58867: waiting for pending results... 22286 1726882812.59407: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22286 1726882812.59832: in run() - task 0affe814-3a2d-a75d-4836-00000000007b 22286 1726882812.59839: variable 'ansible_search_path' from source: unknown 22286 1726882812.59846: variable 'ansible_search_path' from source: unknown 22286 1726882812.59850: calling self._execute() 22286 1726882812.59977: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882812.59991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882812.60010: variable 'omit' from source: magic vars 22286 1726882812.60498: variable 'ansible_distribution_major_version' from source: facts 22286 1726882812.60516: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882812.60682: variable 'network_provider' from source: set_fact 22286 1726882812.60703: Evaluated conditional (network_provider == "nm"): True 22286 1726882812.60914: variable '__network_wpa_supplicant_required' from source: role '' defaults 22286 1726882812.60981: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22286 1726882812.61454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882812.64947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882812.65039: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882812.65094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882812.65163: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882812.65270: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882812.65706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882812.65710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882812.65713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882812.65839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882812.65892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882812.66141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882812.66149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882812.66173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 22286 1726882812.66287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882812.66309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882812.66378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882812.66418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882812.66461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882812.66527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882812.66555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882812.66787: variable 'network_connections' from source: task vars 22286 1726882812.66814: variable 'interface' from source: play vars 22286 1726882812.66914: variable 'interface' from source: play vars 22286 1726882812.67026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882812.67254: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882812.67306: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882812.67355: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882812.67393: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882812.67464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22286 1726882812.67498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22286 1726882812.67538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882812.67594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22286 1726882812.67678: variable '__network_wireless_connections_defined' from source: role '' defaults 22286 1726882812.68121: variable 'network_connections' from source: task vars 22286 1726882812.68136: variable 'interface' from source: play vars 22286 1726882812.68250: variable 'interface' from source: play vars 22286 1726882812.68297: Evaluated conditional (__network_wpa_supplicant_required): False 22286 1726882812.68336: when evaluation is False, skipping this task 22286 1726882812.68341: _execute() done 22286 1726882812.68346: dumping result to json 22286 1726882812.68348: done dumping result, returning 22286 1726882812.68409: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-a75d-4836-00000000007b] 22286 1726882812.68424: sending task result for task 0affe814-3a2d-a75d-4836-00000000007b 22286 1726882812.68629: done sending task result for task 0affe814-3a2d-a75d-4836-00000000007b 22286 1726882812.68633: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22286 1726882812.68727: no more pending results, returning what we have 22286 1726882812.68731: results queue empty 22286 1726882812.68732: checking for any_errors_fatal 22286 1726882812.68885: done checking for any_errors_fatal 22286 1726882812.68886: checking for max_fail_percentage 22286 1726882812.68891: done checking for max_fail_percentage 22286 1726882812.68892: checking to see if all hosts have failed and the running result is not ok 22286 1726882812.68894: done checking to see if all hosts have failed 22286 1726882812.68894: getting the remaining hosts for this loop 22286 1726882812.68897: done getting the remaining hosts for this loop 22286 1726882812.68902: getting the next task for host managed_node3 22286 1726882812.68913: done getting next task for host managed_node3 22286 1726882812.68919: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22286 1726882812.68923: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 22286 1726882812.69065: getting variables 22286 1726882812.69067: in VariableManager get_vars() 22286 1726882812.69123: Calling all_inventory to load vars for managed_node3 22286 1726882812.69127: Calling groups_inventory to load vars for managed_node3 22286 1726882812.69132: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882812.69147: Calling all_plugins_play to load vars for managed_node3 22286 1726882812.69270: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882812.69277: Calling groups_plugins_play to load vars for managed_node3 22286 1726882812.73573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882812.77406: done with get_vars() 22286 1726882812.77445: done getting variables 22286 1726882812.77521: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:12 -0400 (0:00:00.197) 0:00:36.169 ****** 22286 1726882812.77568: entering _queue_task() for managed_node3/service 22286 1726882812.77972: worker is 1 (out of 1 available) 22286 1726882812.77986: exiting _queue_task() for managed_node3/service 22286 1726882812.78000: done queuing things up, now waiting for results queue to drain 22286 1726882812.78002: waiting for pending results... 
22286 1726882812.78458: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 22286 1726882812.78556: in run() - task 0affe814-3a2d-a75d-4836-00000000007c 22286 1726882812.78560: variable 'ansible_search_path' from source: unknown 22286 1726882812.78563: variable 'ansible_search_path' from source: unknown 22286 1726882812.78567: calling self._execute() 22286 1726882812.78669: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882812.78683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882812.78701: variable 'omit' from source: magic vars 22286 1726882812.79177: variable 'ansible_distribution_major_version' from source: facts 22286 1726882812.79202: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882812.79391: variable 'network_provider' from source: set_fact 22286 1726882812.79404: Evaluated conditional (network_provider == "initscripts"): False 22286 1726882812.79417: when evaluation is False, skipping this task 22286 1726882812.79431: _execute() done 22286 1726882812.79530: dumping result to json 22286 1726882812.79536: done dumping result, returning 22286 1726882812.79539: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-a75d-4836-00000000007c] 22286 1726882812.79542: sending task result for task 0affe814-3a2d-a75d-4836-00000000007c 22286 1726882812.79621: done sending task result for task 0affe814-3a2d-a75d-4836-00000000007c 22286 1726882812.79624: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22286 1726882812.79680: no more pending results, returning what we have 22286 1726882812.79685: results queue empty 22286 1726882812.79687: checking for any_errors_fatal 22286 1726882812.79696: done checking for 
any_errors_fatal 22286 1726882812.79698: checking for max_fail_percentage 22286 1726882812.79700: done checking for max_fail_percentage 22286 1726882812.79702: checking to see if all hosts have failed and the running result is not ok 22286 1726882812.79703: done checking to see if all hosts have failed 22286 1726882812.79704: getting the remaining hosts for this loop 22286 1726882812.79706: done getting the remaining hosts for this loop 22286 1726882812.79711: getting the next task for host managed_node3 22286 1726882812.79718: done getting next task for host managed_node3 22286 1726882812.79723: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22286 1726882812.79728: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882812.79756: getting variables 22286 1726882812.79761: in VariableManager get_vars() 22286 1726882812.79814: Calling all_inventory to load vars for managed_node3 22286 1726882812.79818: Calling groups_inventory to load vars for managed_node3 22286 1726882812.79821: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882812.79939: Calling all_plugins_play to load vars for managed_node3 22286 1726882812.79948: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882812.79954: Calling groups_plugins_play to load vars for managed_node3 22286 1726882812.82487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882812.85565: done with get_vars() 22286 1726882812.85601: done getting variables 22286 1726882812.85692: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:12 -0400 (0:00:00.081) 0:00:36.250 ****** 22286 1726882812.85742: entering _queue_task() for managed_node3/copy 22286 1726882812.86168: worker is 1 (out of 1 available) 22286 1726882812.86186: exiting _queue_task() for managed_node3/copy 22286 1726882812.86319: done queuing things up, now waiting for results queue to drain 22286 1726882812.86323: waiting for pending results... 
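The two consecutive skips above (task path main.yml:142 and main.yml:150) follow the role's provider gate: `ansible_distribution_major_version != '6'` evaluates True, then `network_provider == "initscripts"` evaluates False, so each task is skipped. A minimal sketch of what such a guarded task looks like, as a hypothetical reconstruction (the actual entries in roles/network/tasks/main.yml differ in detail):

```yaml
# Hypothetical reconstruction of a guarded task skipped above.
# Both conditions match the "Evaluated conditional" lines in the log.
- name: Enable network service
  service:
    name: network
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"   # False here, so the task is skipped
```

Because the whole `when` list must be true, Ansible records the first failing condition as the `false_condition` seen in the second skip's result.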
22286 1726882812.86775: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22286 1726882812.86809: in run() - task 0affe814-3a2d-a75d-4836-00000000007d 22286 1726882812.86855: variable 'ansible_search_path' from source: unknown 22286 1726882812.86893: variable 'ansible_search_path' from source: unknown 22286 1726882812.86927: calling self._execute() 22286 1726882812.87097: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882812.87133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882812.87138: variable 'omit' from source: magic vars 22286 1726882812.87638: variable 'ansible_distribution_major_version' from source: facts 22286 1726882812.87663: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882812.88055: variable 'network_provider' from source: set_fact 22286 1726882812.88058: Evaluated conditional (network_provider == "initscripts"): False 22286 1726882812.88061: when evaluation is False, skipping this task 22286 1726882812.88063: _execute() done 22286 1726882812.88066: dumping result to json 22286 1726882812.88068: done dumping result, returning 22286 1726882812.88073: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-a75d-4836-00000000007d] 22286 1726882812.88092: sending task result for task 0affe814-3a2d-a75d-4836-00000000007d skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22286 1726882812.88403: no more pending results, returning what we have 22286 1726882812.88412: results queue empty 22286 1726882812.88413: checking for any_errors_fatal 22286 1726882812.88433: done checking for any_errors_fatal 22286 1726882812.88436: checking for max_fail_percentage 22286 
1726882812.88440: done checking for max_fail_percentage 22286 1726882812.88441: checking to see if all hosts have failed and the running result is not ok 22286 1726882812.88442: done checking to see if all hosts have failed 22286 1726882812.88443: getting the remaining hosts for this loop 22286 1726882812.88445: done getting the remaining hosts for this loop 22286 1726882812.88450: getting the next task for host managed_node3 22286 1726882812.88467: done getting next task for host managed_node3 22286 1726882812.88510: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22286 1726882812.88516: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882812.88602: getting variables 22286 1726882812.88604: in VariableManager get_vars() 22286 1726882812.88732: Calling all_inventory to load vars for managed_node3 22286 1726882812.88738: Calling groups_inventory to load vars for managed_node3 22286 1726882812.88773: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882812.88815: Calling all_plugins_play to load vars for managed_node3 22286 1726882812.88820: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882812.88824: Calling groups_plugins_play to load vars for managed_node3 22286 1726882812.89396: done sending task result for task 0affe814-3a2d-a75d-4836-00000000007d 22286 1726882812.89400: WORKER PROCESS EXITING 22286 1726882812.91677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882812.96243: done with get_vars() 22286 1726882812.96297: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:12 -0400 (0:00:00.107) 0:00:36.358 ****** 22286 1726882812.96528: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 22286 1726882812.96955: worker is 1 (out of 1 available) 22286 1726882812.96968: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 22286 1726882812.96985: done queuing things up, now waiting for results queue to drain 22286 1726882812.96987: waiting for pending results... 
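The `network_connections` action queued here eventually executes with the module arguments recorded later in the log (provider `nm`, tearing down the `veth0` profile). Expressed as a task, that invocation would look roughly like the following hedged reconstruction from the logged `module_args`; the real role templates `connections` from role variables rather than hardcoding them:

```yaml
# Reconstruction from the logged module_args for this run only.
- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    provider: nm
    connections:
      - name: veth0
        persistent_state: absent   # remove the persistent profile
        state: down                # and take the connection down
```

The `__header` argument in the log ("# Ansible managed ... system_role:network") is injected by the role itself via the `get_ansible_managed.j2` template lookup shown below, not supplied by the playbook author.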
22286 1726882812.97497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22286 1726882812.97770: in run() - task 0affe814-3a2d-a75d-4836-00000000007e 22286 1726882812.97819: variable 'ansible_search_path' from source: unknown 22286 1726882812.97869: variable 'ansible_search_path' from source: unknown 22286 1726882812.97979: calling self._execute() 22286 1726882812.98110: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882812.98178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882812.98221: variable 'omit' from source: magic vars 22286 1726882812.99470: variable 'ansible_distribution_major_version' from source: facts 22286 1726882812.99474: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882812.99476: variable 'omit' from source: magic vars 22286 1726882812.99479: variable 'omit' from source: magic vars 22286 1726882812.99817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882813.02556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882813.02652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882813.02696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882813.02748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882813.02783: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882813.02888: variable 'network_provider' from source: set_fact 22286 1726882813.03065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882813.03123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882813.03156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882813.03211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882813.03234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882813.03328: variable 'omit' from source: magic vars 22286 1726882813.03489: variable 'omit' from source: magic vars 22286 1726882813.03628: variable 'network_connections' from source: task vars 22286 1726882813.03642: variable 'interface' from source: play vars 22286 1726882813.03727: variable 'interface' from source: play vars 22286 1726882813.03923: variable 'omit' from source: magic vars 22286 1726882813.03939: variable '__lsr_ansible_managed' from source: task vars 22286 1726882813.04019: variable '__lsr_ansible_managed' from source: task vars 22286 1726882813.04443: Loaded config def from plugin (lookup/template) 22286 1726882813.04447: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22286 1726882813.04449: File lookup term: get_ansible_managed.j2 22286 1726882813.04452: variable 'ansible_search_path' from source: unknown 22286 1726882813.04455: evaluation_path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22286 1726882813.04545: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22286 1726882813.04554: variable 'ansible_search_path' from source: unknown 22286 1726882813.23092: variable 'ansible_managed' from source: unknown 22286 1726882813.23617: variable 'omit' from source: magic vars 22286 1726882813.23763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882813.23800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882813.23821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882813.23844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882813.23942: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882813.23986: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882813.23991: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882813.23994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882813.24329: Set connection var ansible_shell_executable to /bin/sh 22286 1726882813.24332: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882813.24344: Set connection var ansible_connection to ssh 22286 1726882813.24347: Set connection var ansible_shell_type to sh 22286 1726882813.24358: Set connection var ansible_timeout to 10 22286 1726882813.24369: Set connection var ansible_pipelining to False 22286 1726882813.24523: variable 'ansible_shell_executable' from source: unknown 22286 1726882813.24526: variable 'ansible_connection' from source: unknown 22286 1726882813.24529: variable 'ansible_module_compression' from source: unknown 22286 1726882813.24531: variable 'ansible_shell_type' from source: unknown 22286 1726882813.24535: variable 'ansible_shell_executable' from source: unknown 22286 1726882813.24538: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882813.24544: variable 'ansible_pipelining' from source: unknown 22286 1726882813.24551: variable 'ansible_timeout' from source: unknown 22286 1726882813.24554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882813.24955: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882813.24970: variable 'omit' from source: magic vars 22286 1726882813.24984: starting attempt loop 22286 
1726882813.24988: running the handler 22286 1726882813.25000: _low_level_execute_command(): starting 22286 1726882813.25063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882813.26460: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882813.26536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882813.26540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.26543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.26678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882813.26942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882813.26950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882813.28701: stdout chunk (state=3): >>>/root <<< 22286 1726882813.28840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882813.28901: stderr chunk (state=3): >>><<< 22286 1726882813.28905: stdout chunk (state=3): >>><<< 22286 1726882813.29060: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882813.29073: _low_level_execute_command(): starting 22286 1726882813.29080: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140 `" && echo ansible-tmp-1726882813.2905943-23614-40580169020140="` echo /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140 `" ) && sleep 0' 22286 1726882813.30568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882813.30572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882813.30578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882813.30580: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.30583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882813.30690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882813.30693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.30795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882813.30810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882813.30963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882813.31106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882813.33226: stdout chunk (state=3): >>>ansible-tmp-1726882813.2905943-23614-40580169020140=/root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140 <<< 22286 1726882813.33364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882813.33441: stderr chunk (state=3): >>><<< 22286 1726882813.33445: stdout chunk (state=3): >>><<< 22286 1726882813.33531: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882813.2905943-23614-40580169020140=/root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882813.33584: variable 'ansible_module_compression' from source: unknown 22286 1726882813.33626: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 22286 1726882813.33674: variable 'ansible_facts' from source: unknown 22286 1726882813.34008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py 22286 1726882813.34361: Sending initial data 22286 1726882813.34364: Sent initial data (167 bytes) 22286 1726882813.35862: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882813.35869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882813.35930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.36181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882813.36186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882813.36252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882813.36339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882813.38071: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22286 1726882813.38089: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882813.38227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22286 1726882813.38383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpke29xder /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py <<< 22286 1726882813.38393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py" <<< 22286 1726882813.38512: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpke29xder" to remote "/root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py" <<< 22286 1726882813.41480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882813.41512: stderr chunk (state=3): >>><<< 22286 1726882813.41536: stdout chunk (state=3): >>><<< 22286 1726882813.41707: done transferring module to remote 22286 1726882813.41711: _low_level_execute_command(): starting 22286 1726882813.41714: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/ /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py && sleep 0' 22286 1726882813.42351: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.42411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882813.42467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882813.42543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882813.42714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882813.44891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882813.44898: stdout chunk (state=3): >>><<< 22286 1726882813.44900: stderr chunk (state=3): >>><<< 22286 1726882813.45051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882813.45055: _low_level_execute_command(): starting 22286 1726882813.45058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/AnsiballZ_network_connections.py && sleep 0' 22286 1726882813.46288: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.46331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882813.46339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882813.46482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 
1726882813.46769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882813.84340: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 22286 1726882813.84353: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ytqv0qih/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ytqv0qih/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28: error=unknown <<< 22286 1726882813.84559: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22286 1726882813.86630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882813.86637: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882813.86693: stderr chunk (state=3): >>><<< 22286 1726882813.86705: stdout chunk (state=3): >>><<< 22286 1726882813.86727: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ytqv0qih/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ytqv0qih/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/c4d4cbe6-62c7-4ab2-a39e-4a93fe001a28: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882813.86766: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882813.86782: _low_level_execute_command(): starting 22286 1726882813.86785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882813.2905943-23614-40580169020140/ > /dev/null 2>&1 && sleep 0' 22286 
1726882813.87204: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882813.87211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.87213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882813.87216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882813.87275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882813.87279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882813.87389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882813.89577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882813.89581: stdout chunk (state=3): >>><<< 22286 1726882813.89685: stderr chunk (state=3): >>><<< 22286 1726882813.89689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882813.89696: handler run complete 22286 1726882813.89725: attempt loop complete, returning result 22286 1726882813.89774: _execute() done 22286 1726882813.89811: dumping result to json 22286 1726882813.89848: done dumping result, returning 22286 1726882813.89863: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-a75d-4836-00000000007e] 22286 1726882813.89901: sending task result for task 0affe814-3a2d-a75d-4836-00000000007e 22286 1726882813.90095: done sending task result for task 0affe814-3a2d-a75d-4836-00000000007e 22286 1726882813.90098: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 22286 1726882813.90383: no more pending results, returning what we have 22286 
1726882813.90387: results queue empty 22286 1726882813.90388: checking for any_errors_fatal 22286 1726882813.90396: done checking for any_errors_fatal 22286 1726882813.90397: checking for max_fail_percentage 22286 1726882813.90399: done checking for max_fail_percentage 22286 1726882813.90401: checking to see if all hosts have failed and the running result is not ok 22286 1726882813.90402: done checking to see if all hosts have failed 22286 1726882813.90403: getting the remaining hosts for this loop 22286 1726882813.90405: done getting the remaining hosts for this loop 22286 1726882813.90410: getting the next task for host managed_node3 22286 1726882813.90417: done getting next task for host managed_node3 22286 1726882813.90422: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22286 1726882813.90426: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882813.90644: getting variables 22286 1726882813.90646: in VariableManager get_vars() 22286 1726882813.90690: Calling all_inventory to load vars for managed_node3 22286 1726882813.90694: Calling groups_inventory to load vars for managed_node3 22286 1726882813.90697: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882813.90707: Calling all_plugins_play to load vars for managed_node3 22286 1726882813.90711: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882813.90715: Calling groups_plugins_play to load vars for managed_node3 22286 1726882813.93722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882813.98519: done with get_vars() 22286 1726882813.98557: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:13 -0400 (0:00:01.021) 0:00:37.380 ****** 22286 1726882813.98667: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 22286 1726882813.99047: worker is 1 (out of 1 available) 22286 1726882813.99062: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 22286 1726882813.99079: done queuing things up, now waiting for results queue to drain 22286 1726882813.99081: waiting for pending results... 
22286 1726882813.99486: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 22286 1726882813.99741: in run() - task 0affe814-3a2d-a75d-4836-00000000007f 22286 1726882813.99745: variable 'ansible_search_path' from source: unknown 22286 1726882813.99748: variable 'ansible_search_path' from source: unknown 22286 1726882813.99754: calling self._execute() 22286 1726882813.99803: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882813.99819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882813.99846: variable 'omit' from source: magic vars 22286 1726882814.00310: variable 'ansible_distribution_major_version' from source: facts 22286 1726882814.00331: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882814.00496: variable 'network_state' from source: role '' defaults 22286 1726882814.00646: Evaluated conditional (network_state != {}): False 22286 1726882814.00677: when evaluation is False, skipping this task 22286 1726882814.00772: _execute() done 22286 1726882814.00781: dumping result to json 22286 1726882814.00864: done dumping result, returning 22286 1726882814.00868: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-a75d-4836-00000000007f] 22286 1726882814.00870: sending task result for task 0affe814-3a2d-a75d-4836-00000000007f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22286 1726882814.01235: no more pending results, returning what we have 22286 1726882814.01240: results queue empty 22286 1726882814.01241: checking for any_errors_fatal 22286 1726882814.01257: done checking for any_errors_fatal 22286 1726882814.01259: checking for max_fail_percentage 22286 1726882814.01261: done checking for max_fail_percentage 22286 1726882814.01266: 
checking to see if all hosts have failed and the running result is not ok 22286 1726882814.01267: done checking to see if all hosts have failed 22286 1726882814.01268: getting the remaining hosts for this loop 22286 1726882814.01270: done getting the remaining hosts for this loop 22286 1726882814.01277: getting the next task for host managed_node3 22286 1726882814.01286: done getting next task for host managed_node3 22286 1726882814.01290: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22286 1726882814.01294: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882814.01329: getting variables 22286 1726882814.01332: in VariableManager get_vars() 22286 1726882814.01707: Calling all_inventory to load vars for managed_node3 22286 1726882814.01711: Calling groups_inventory to load vars for managed_node3 22286 1726882814.01714: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882814.01725: Calling all_plugins_play to load vars for managed_node3 22286 1726882814.01728: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882814.01732: Calling groups_plugins_play to load vars for managed_node3 22286 1726882814.02374: done sending task result for task 0affe814-3a2d-a75d-4836-00000000007f 22286 1726882814.02380: WORKER PROCESS EXITING 22286 1726882814.05784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882814.10486: done with get_vars() 22286 1726882814.10594: done getting variables 22286 1726882814.10669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:40:14 -0400 (0:00:00.120) 0:00:37.500 ****** 22286 1726882814.10715: entering _queue_task() for managed_node3/debug 22286 1726882814.11532: worker is 1 (out of 1 available) 22286 1726882814.11548: exiting _queue_task() for managed_node3/debug 22286 1726882814.11666: done queuing things up, now waiting for results queue to drain 22286 1726882814.11668: waiting for pending results... 
22286 1726882814.12095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22286 1726882814.12610: in run() - task 0affe814-3a2d-a75d-4836-000000000080 22286 1726882814.12631: variable 'ansible_search_path' from source: unknown 22286 1726882814.12638: variable 'ansible_search_path' from source: unknown 22286 1726882814.12738: calling self._execute() 22286 1726882814.13019: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.13031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.13119: variable 'omit' from source: magic vars 22286 1726882814.14037: variable 'ansible_distribution_major_version' from source: facts 22286 1726882814.14052: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882814.14056: variable 'omit' from source: magic vars 22286 1726882814.14306: variable 'omit' from source: magic vars 22286 1726882814.14482: variable 'omit' from source: magic vars 22286 1726882814.14514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882814.14583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882814.14605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882814.14640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882814.14783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882814.14829: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882814.14833: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.14842: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 22286 1726882814.15224: Set connection var ansible_shell_executable to /bin/sh 22286 1726882814.15228: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882814.15294: Set connection var ansible_connection to ssh 22286 1726882814.15298: Set connection var ansible_shell_type to sh 22286 1726882814.15351: Set connection var ansible_timeout to 10 22286 1726882814.15538: Set connection var ansible_pipelining to False 22286 1726882814.15542: variable 'ansible_shell_executable' from source: unknown 22286 1726882814.15545: variable 'ansible_connection' from source: unknown 22286 1726882814.15548: variable 'ansible_module_compression' from source: unknown 22286 1726882814.15550: variable 'ansible_shell_type' from source: unknown 22286 1726882814.15552: variable 'ansible_shell_executable' from source: unknown 22286 1726882814.15560: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.15562: variable 'ansible_pipelining' from source: unknown 22286 1726882814.15565: variable 'ansible_timeout' from source: unknown 22286 1726882814.15567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.15964: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882814.16058: variable 'omit' from source: magic vars 22286 1726882814.16062: starting attempt loop 22286 1726882814.16065: running the handler 22286 1726882814.16583: variable '__network_connections_result' from source: set_fact 22286 1726882814.16586: handler run complete 22286 1726882814.16589: attempt loop complete, returning result 22286 1726882814.16591: _execute() done 22286 1726882814.16593: dumping result to json 22286 1726882814.16595: 
done dumping result, returning 22286 1726882814.16598: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-a75d-4836-000000000080] 22286 1726882814.16600: sending task result for task 0affe814-3a2d-a75d-4836-000000000080 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 22286 1726882814.16841: no more pending results, returning what we have 22286 1726882814.16846: results queue empty 22286 1726882814.16847: checking for any_errors_fatal 22286 1726882814.16857: done checking for any_errors_fatal 22286 1726882814.16858: checking for max_fail_percentage 22286 1726882814.16860: done checking for max_fail_percentage 22286 1726882814.16864: checking to see if all hosts have failed and the running result is not ok 22286 1726882814.16865: done checking to see if all hosts have failed 22286 1726882814.16866: getting the remaining hosts for this loop 22286 1726882814.16868: done getting the remaining hosts for this loop 22286 1726882814.16871: getting the next task for host managed_node3 22286 1726882814.16879: done getting next task for host managed_node3 22286 1726882814.16883: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22286 1726882814.16888: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882814.16900: getting variables 22286 1726882814.16902: in VariableManager get_vars() 22286 1726882814.17039: Calling all_inventory to load vars for managed_node3 22286 1726882814.17043: Calling groups_inventory to load vars for managed_node3 22286 1726882814.17046: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882814.17057: Calling all_plugins_play to load vars for managed_node3 22286 1726882814.17067: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882814.17072: Calling groups_plugins_play to load vars for managed_node3 22286 1726882814.17592: done sending task result for task 0affe814-3a2d-a75d-4836-000000000080 22286 1726882814.17596: WORKER PROCESS EXITING 22286 1726882814.20369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882814.24841: done with get_vars() 22286 1726882814.24893: done getting variables 22286 1726882814.24978: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:40:14 -0400 (0:00:00.143) 0:00:37.643 ****** 22286 1726882814.25020: entering _queue_task() for managed_node3/debug 22286 1726882814.25465: worker is 1 (out of 1 available) 22286 1726882814.25483: exiting _queue_task() for managed_node3/debug 22286 1726882814.25613: done queuing things up, now waiting for results queue to drain 22286 1726882814.25616: waiting for pending results... 
22286 1726882814.25937: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22286 1726882814.25991: in run() - task 0affe814-3a2d-a75d-4836-000000000081 22286 1726882814.26009: variable 'ansible_search_path' from source: unknown 22286 1726882814.26013: variable 'ansible_search_path' from source: unknown 22286 1726882814.26089: calling self._execute() 22286 1726882814.26187: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.26240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.26258: variable 'omit' from source: magic vars 22286 1726882814.26679: variable 'ansible_distribution_major_version' from source: facts 22286 1726882814.26710: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882814.26714: variable 'omit' from source: magic vars 22286 1726882814.26786: variable 'omit' from source: magic vars 22286 1726882814.26858: variable 'omit' from source: magic vars 22286 1726882814.26967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882814.27018: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882814.27040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882814.27293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882814.27363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882814.27367: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882814.27370: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.27372: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 22286 1726882814.27695: Set connection var ansible_shell_executable to /bin/sh 22286 1726882814.27706: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882814.27710: Set connection var ansible_connection to ssh 22286 1726882814.27712: Set connection var ansible_shell_type to sh 22286 1726882814.27722: Set connection var ansible_timeout to 10 22286 1726882814.27801: Set connection var ansible_pipelining to False 22286 1726882814.27871: variable 'ansible_shell_executable' from source: unknown 22286 1726882814.27875: variable 'ansible_connection' from source: unknown 22286 1726882814.27881: variable 'ansible_module_compression' from source: unknown 22286 1726882814.27884: variable 'ansible_shell_type' from source: unknown 22286 1726882814.27889: variable 'ansible_shell_executable' from source: unknown 22286 1726882814.27893: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.27898: variable 'ansible_pipelining' from source: unknown 22286 1726882814.27907: variable 'ansible_timeout' from source: unknown 22286 1726882814.27909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.28236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882814.28305: variable 'omit' from source: magic vars 22286 1726882814.28312: starting attempt loop 22286 1726882814.28315: running the handler 22286 1726882814.28452: variable '__network_connections_result' from source: set_fact 22286 1726882814.28638: variable '__network_connections_result' from source: set_fact 22286 1726882814.28817: handler run complete 22286 1726882814.28858: attempt loop complete, returning result 22286 1726882814.28861: 
_execute() done 22286 1726882814.28864: dumping result to json 22286 1726882814.28881: done dumping result, returning 22286 1726882814.28889: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-a75d-4836-000000000081] 22286 1726882814.28900: sending task result for task 0affe814-3a2d-a75d-4836-000000000081 22286 1726882814.29121: done sending task result for task 0affe814-3a2d-a75d-4836-000000000081 22286 1726882814.29125: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 22286 1726882814.29233: no more pending results, returning what we have 22286 1726882814.29238: results queue empty 22286 1726882814.29239: checking for any_errors_fatal 22286 1726882814.29245: done checking for any_errors_fatal 22286 1726882814.29246: checking for max_fail_percentage 22286 1726882814.29248: done checking for max_fail_percentage 22286 1726882814.29249: checking to see if all hosts have failed and the running result is not ok 22286 1726882814.29250: done checking to see if all hosts have failed 22286 1726882814.29253: getting the remaining hosts for this loop 22286 1726882814.29255: done getting the remaining hosts for this loop 22286 1726882814.29263: getting the next task for host managed_node3 22286 1726882814.29270: done getting next task for host managed_node3 22286 1726882814.29274: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22286 1726882814.29281: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882814.29300: getting variables 22286 1726882814.29302: in VariableManager get_vars() 22286 1726882814.29468: Calling all_inventory to load vars for managed_node3 22286 1726882814.29472: Calling groups_inventory to load vars for managed_node3 22286 1726882814.29478: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882814.29493: Calling all_plugins_play to load vars for managed_node3 22286 1726882814.29497: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882814.29502: Calling groups_plugins_play to load vars for managed_node3 22286 1726882814.31944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882814.35669: done with get_vars() 22286 1726882814.35705: done getting variables 22286 1726882814.35776: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:40:14 -0400 (0:00:00.107) 0:00:37.751 ****** 22286 1726882814.35820: entering _queue_task() for managed_node3/debug 22286 1726882814.36179: 
worker is 1 (out of 1 available) 22286 1726882814.36194: exiting _queue_task() for managed_node3/debug 22286 1726882814.36214: done queuing things up, now waiting for results queue to drain 22286 1726882814.36216: waiting for pending results... 22286 1726882814.36588: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22286 1726882814.36782: in run() - task 0affe814-3a2d-a75d-4836-000000000082 22286 1726882814.36895: variable 'ansible_search_path' from source: unknown 22286 1726882814.36899: variable 'ansible_search_path' from source: unknown 22286 1726882814.36903: calling self._execute() 22286 1726882814.36967: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.36985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.37011: variable 'omit' from source: magic vars 22286 1726882814.37533: variable 'ansible_distribution_major_version' from source: facts 22286 1726882814.37564: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882814.37737: variable 'network_state' from source: role '' defaults 22286 1726882814.37761: Evaluated conditional (network_state != {}): False 22286 1726882814.37782: when evaluation is False, skipping this task 22286 1726882814.37785: _execute() done 22286 1726882814.37871: dumping result to json 22286 1726882814.37874: done dumping result, returning 22286 1726882814.37881: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-a75d-4836-000000000082] 22286 1726882814.37885: sending task result for task 0affe814-3a2d-a75d-4836-000000000082 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 22286 1726882814.38012: no more pending results, returning what we have 22286 1726882814.38016: results queue empty 22286 1726882814.38017: checking for 
any_errors_fatal 22286 1726882814.38029: done checking for any_errors_fatal 22286 1726882814.38030: checking for max_fail_percentage 22286 1726882814.38033: done checking for max_fail_percentage 22286 1726882814.38036: checking to see if all hosts have failed and the running result is not ok 22286 1726882814.38038: done checking to see if all hosts have failed 22286 1726882814.38038: getting the remaining hosts for this loop 22286 1726882814.38041: done getting the remaining hosts for this loop 22286 1726882814.38045: getting the next task for host managed_node3 22286 1726882814.38053: done getting next task for host managed_node3 22286 1726882814.38059: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22286 1726882814.38064: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882814.38205: getting variables 22286 1726882814.38207: in VariableManager get_vars() 22286 1726882814.38321: Calling all_inventory to load vars for managed_node3 22286 1726882814.38325: Calling groups_inventory to load vars for managed_node3 22286 1726882814.38328: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882814.38337: done sending task result for task 0affe814-3a2d-a75d-4836-000000000082 22286 1726882814.38341: WORKER PROCESS EXITING 22286 1726882814.38350: Calling all_plugins_play to load vars for managed_node3 22286 1726882814.38355: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882814.38359: Calling groups_plugins_play to load vars for managed_node3 22286 1726882814.41424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882814.47755: done with get_vars() 22286 1726882814.47808: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:40:14 -0400 (0:00:00.123) 0:00:37.874 ****** 22286 1726882814.48131: entering _queue_task() for managed_node3/ping 22286 1726882814.48607: worker is 1 (out of 1 available) 22286 1726882814.48620: exiting _queue_task() for managed_node3/ping 22286 1726882814.48637: done queuing things up, now waiting for results queue to drain 22286 1726882814.48638: waiting for pending results... 
22286 1726882814.48957: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 22286 1726882814.49315: in run() - task 0affe814-3a2d-a75d-4836-000000000083 22286 1726882814.49319: variable 'ansible_search_path' from source: unknown 22286 1726882814.49323: variable 'ansible_search_path' from source: unknown 22286 1726882814.49326: calling self._execute() 22286 1726882814.49330: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.49332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.49340: variable 'omit' from source: magic vars 22286 1726882814.49929: variable 'ansible_distribution_major_version' from source: facts 22286 1726882814.49944: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882814.49952: variable 'omit' from source: magic vars 22286 1726882814.50033: variable 'omit' from source: magic vars 22286 1726882814.50080: variable 'omit' from source: magic vars 22286 1726882814.50136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882814.50178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882814.50197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882814.50229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882814.50316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882814.50321: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882814.50325: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.50328: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 22286 1726882814.50436: Set connection var ansible_shell_executable to /bin/sh 22286 1726882814.50451: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882814.50455: Set connection var ansible_connection to ssh 22286 1726882814.50458: Set connection var ansible_shell_type to sh 22286 1726882814.50467: Set connection var ansible_timeout to 10 22286 1726882814.50480: Set connection var ansible_pipelining to False 22286 1726882814.50506: variable 'ansible_shell_executable' from source: unknown 22286 1726882814.50509: variable 'ansible_connection' from source: unknown 22286 1726882814.50514: variable 'ansible_module_compression' from source: unknown 22286 1726882814.50517: variable 'ansible_shell_type' from source: unknown 22286 1726882814.50531: variable 'ansible_shell_executable' from source: unknown 22286 1726882814.50534: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882814.50538: variable 'ansible_pipelining' from source: unknown 22286 1726882814.50540: variable 'ansible_timeout' from source: unknown 22286 1726882814.50642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882814.50800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22286 1726882814.50812: variable 'omit' from source: magic vars 22286 1726882814.50818: starting attempt loop 22286 1726882814.50821: running the handler 22286 1726882814.50841: _low_level_execute_command(): starting 22286 1726882814.50859: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882814.52042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882814.52047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882814.52141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882814.52270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882814.54099: stdout chunk (state=3): >>>/root <<< 22286 1726882814.54379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882814.54383: stdout chunk (state=3): >>><<< 22286 1726882814.54386: stderr chunk (state=3): >>><<< 22286 1726882814.54389: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882814.54391: _low_level_execute_command(): starting 22286 1726882814.54395: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393 `" && echo ansible-tmp-1726882814.5431406-23678-102124931120393="` echo /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393 `" ) && sleep 0' 22286 1726882814.54925: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882814.54935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882814.54953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882814.54964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882814.54978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882814.55031: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882814.55035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882814.55048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882814.55051: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882814.55070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882814.55073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882814.55076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882814.55078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882814.55080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882814.55170: stderr chunk (state=3): >>>debug2: match found <<< 22286 1726882814.55173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882814.55178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882814.55189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882814.55212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882814.55352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882814.57781: stdout chunk (state=3): >>>ansible-tmp-1726882814.5431406-23678-102124931120393=/root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393 <<< 22286 1726882814.57785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882814.57787: stdout chunk (state=3): >>><<< 22286 1726882814.57790: stderr chunk (state=3): >>><<< 22286 1726882814.57793: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882814.5431406-23678-102124931120393=/root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882814.57979: variable 'ansible_module_compression' from source: unknown 22286 1726882814.58049: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 22286 1726882814.58173: variable 'ansible_facts' from source: unknown 22286 1726882814.58254: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py 22286 1726882814.58613: Sending initial data 22286 1726882814.58617: Sent initial data (153 bytes) 22286 1726882814.60153: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882814.60507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882814.62087: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22286 1726882814.62095: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882814.62197: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882814.62308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpo64yc4zn /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py <<< 22286 1726882814.62313: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py" <<< 22286 1726882814.62518: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpo64yc4zn" to remote "/root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py" <<< 22286 1726882814.64898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882814.64914: stdout chunk (state=3): >>><<< 22286 1726882814.64931: stderr chunk (state=3): >>><<< 22286 1726882814.65006: done transferring module to remote 22286 1726882814.65214: _low_level_execute_command(): starting 22286 1726882814.65218: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/ /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py && sleep 0' 22286 1726882814.66452: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882814.66621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882814.66635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882814.66660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882814.66805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882814.68845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882814.68953: stderr chunk (state=3): >>><<< 22286 1726882814.68965: stdout chunk (state=3): >>><<< 22286 1726882814.69055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882814.69066: _low_level_execute_command(): starting 22286 1726882814.69163: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/AnsiballZ_ping.py && sleep 0' 22286 1726882814.70301: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882814.70305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882814.70414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882814.70632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882814.70770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882814.88292: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22286 1726882814.89756: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882814.89761: stderr chunk (state=3): >>><<< 22286 1726882814.89770: stdout chunk (state=3): >>><<< 22286 1726882814.89810: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
22286 1726882814.89832: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882814.89848: _low_level_execute_command(): starting 22286 1726882814.89888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882814.5431406-23678-102124931120393/ > /dev/null 2>&1 && sleep 0' 22286 1726882814.91060: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882814.91395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master <<< 22286 1726882814.91454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882814.91597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882814.93624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882814.93696: stderr chunk (state=3): >>><<< 22286 1726882814.93702: stdout chunk (state=3): >>><<< 22286 1726882814.93725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882814.93733: handler run complete 22286 1726882814.93755: attempt loop complete, returning result 22286 1726882814.93758: _execute() done 22286 1726882814.93761: dumping result to json 22286 1726882814.93767: done dumping result, returning 22286 1726882814.93781: 
done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-a75d-4836-000000000083] 22286 1726882814.93786: sending task result for task 0affe814-3a2d-a75d-4836-000000000083 22286 1726882814.93898: done sending task result for task 0affe814-3a2d-a75d-4836-000000000083 22286 1726882814.93901: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 22286 1726882814.94005: no more pending results, returning what we have 22286 1726882814.94008: results queue empty 22286 1726882814.94009: checking for any_errors_fatal 22286 1726882814.94016: done checking for any_errors_fatal 22286 1726882814.94017: checking for max_fail_percentage 22286 1726882814.94019: done checking for max_fail_percentage 22286 1726882814.94020: checking to see if all hosts have failed and the running result is not ok 22286 1726882814.94021: done checking to see if all hosts have failed 22286 1726882814.94022: getting the remaining hosts for this loop 22286 1726882814.94024: done getting the remaining hosts for this loop 22286 1726882814.94028: getting the next task for host managed_node3 22286 1726882814.94043: done getting next task for host managed_node3 22286 1726882814.94045: ^ task is: TASK: meta (role_complete) 22286 1726882814.94049: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882814.94061: getting variables 22286 1726882814.94063: in VariableManager get_vars() 22286 1726882814.94223: Calling all_inventory to load vars for managed_node3 22286 1726882814.94227: Calling groups_inventory to load vars for managed_node3 22286 1726882814.94230: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882814.94244: Calling all_plugins_play to load vars for managed_node3 22286 1726882814.94248: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882814.94253: Calling groups_plugins_play to load vars for managed_node3 22286 1726882814.97245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882814.99859: done with get_vars() 22286 1726882814.99894: done getting variables 22286 1726882814.99994: done queuing things up, now waiting for results queue to drain 22286 1726882814.99997: results queue empty 22286 1726882814.99998: checking for any_errors_fatal 22286 1726882815.00003: done checking for any_errors_fatal 22286 1726882815.00004: checking for max_fail_percentage 22286 1726882815.00006: done checking for max_fail_percentage 22286 1726882815.00007: checking to see if all hosts have failed and the running result is not ok 22286 1726882815.00007: done checking to see if all hosts have failed 22286 1726882815.00008: getting the remaining hosts for this loop 22286 1726882815.00009: done getting the remaining hosts for this loop 22286 1726882815.00013: getting the next task for host managed_node3 22286 1726882815.00022: done getting next task for host managed_node3 22286 1726882815.00025: ^ task is: TASK: Include the task 'manage_test_interface.yml' 22286 1726882815.00027: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 22286 1726882815.00030: getting variables 22286 1726882815.00032: in VariableManager get_vars() 22286 1726882815.00051: Calling all_inventory to load vars for managed_node3 22286 1726882815.00053: Calling groups_inventory to load vars for managed_node3 22286 1726882815.00055: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.00059: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.00061: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.00063: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.01608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.04221: done with get_vars() 22286 1726882815.04256: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Friday 20 September 2024 21:40:15 -0400 (0:00:00.562) 0:00:38.436 ****** 22286 1726882815.04342: entering _queue_task() for managed_node3/include_tasks 22286 1726882815.04700: worker is 1 (out of 1 available) 22286 1726882815.04715: exiting _queue_task() for managed_node3/include_tasks 22286 1726882815.04730: done queuing things up, now waiting for results queue to drain 22286 1726882815.04731: waiting for pending results... 
22286 1726882815.05206: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 22286 1726882815.05213: in run() - task 0affe814-3a2d-a75d-4836-0000000000b3 22286 1726882815.05217: variable 'ansible_search_path' from source: unknown 22286 1726882815.05220: calling self._execute() 22286 1726882815.05463: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.05467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.05474: variable 'omit' from source: magic vars 22286 1726882815.05818: variable 'ansible_distribution_major_version' from source: facts 22286 1726882815.05836: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882815.05841: _execute() done 22286 1726882815.05844: dumping result to json 22286 1726882815.05850: done dumping result, returning 22286 1726882815.05858: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0affe814-3a2d-a75d-4836-0000000000b3] 22286 1726882815.05866: sending task result for task 0affe814-3a2d-a75d-4836-0000000000b3 22286 1726882815.06098: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000b3 22286 1726882815.06118: WORKER PROCESS EXITING 22286 1726882815.06150: no more pending results, returning what we have 22286 1726882815.06156: in VariableManager get_vars() 22286 1726882815.06201: Calling all_inventory to load vars for managed_node3 22286 1726882815.06205: Calling groups_inventory to load vars for managed_node3 22286 1726882815.06208: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.06220: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.06225: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.06229: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.07527: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.09104: done with get_vars() 22286 1726882815.09143: variable 'ansible_search_path' from source: unknown 22286 1726882815.09158: we have included files to process 22286 1726882815.09159: generating all_blocks data 22286 1726882815.09161: done generating all_blocks data 22286 1726882815.09167: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22286 1726882815.09168: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22286 1726882815.09171: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22286 1726882815.10168: in VariableManager get_vars() 22286 1726882815.10205: done with get_vars() 22286 1726882815.10925: done processing included file 22286 1726882815.10927: iterating over new_blocks loaded from include file 22286 1726882815.10929: in VariableManager get_vars() 22286 1726882815.10948: done with get_vars() 22286 1726882815.10954: filtering new block on tags 22286 1726882815.10991: done filtering new block on tags 22286 1726882815.10993: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 22286 1726882815.10998: extending task lists for all hosts with included blocks 22286 1726882815.14009: done extending task lists 22286 1726882815.14011: done processing included files 22286 1726882815.14011: results queue empty 22286 1726882815.14012: checking for any_errors_fatal 22286 1726882815.14013: done checking for any_errors_fatal 22286 1726882815.14014: checking for max_fail_percentage 22286 1726882815.14015: done 
checking for max_fail_percentage 22286 1726882815.14015: checking to see if all hosts have failed and the running result is not ok 22286 1726882815.14016: done checking to see if all hosts have failed 22286 1726882815.14017: getting the remaining hosts for this loop 22286 1726882815.14018: done getting the remaining hosts for this loop 22286 1726882815.14020: getting the next task for host managed_node3 22286 1726882815.14023: done getting next task for host managed_node3 22286 1726882815.14025: ^ task is: TASK: Ensure state in ["present", "absent"] 22286 1726882815.14027: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882815.14029: getting variables 22286 1726882815.14030: in VariableManager get_vars() 22286 1726882815.14042: Calling all_inventory to load vars for managed_node3 22286 1726882815.14048: Calling groups_inventory to load vars for managed_node3 22286 1726882815.14050: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.14055: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.14057: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.14059: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.15409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.18750: done with get_vars() 22286 1726882815.18785: done getting variables 22286 1726882815.18844: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:40:15 -0400 (0:00:00.145) 0:00:38.582 ****** 22286 1726882815.18879: entering _queue_task() for managed_node3/fail 22286 1726882815.19274: worker is 1 (out of 1 available) 22286 1726882815.19288: exiting _queue_task() for managed_node3/fail 22286 1726882815.19305: done queuing things up, now waiting for results queue to drain 22286 1726882815.19306: waiting for pending results... 
22286 1726882815.19680: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 22286 1726882815.19791: in run() - task 0affe814-3a2d-a75d-4836-0000000005cc 22286 1726882815.19797: variable 'ansible_search_path' from source: unknown 22286 1726882815.19801: variable 'ansible_search_path' from source: unknown 22286 1726882815.19832: calling self._execute() 22286 1726882815.20043: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.20049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.20053: variable 'omit' from source: magic vars 22286 1726882815.20536: variable 'ansible_distribution_major_version' from source: facts 22286 1726882815.20552: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882815.20748: variable 'state' from source: include params 22286 1726882815.20755: Evaluated conditional (state not in ["present", "absent"]): False 22286 1726882815.20827: when evaluation is False, skipping this task 22286 1726882815.20833: _execute() done 22286 1726882815.20839: dumping result to json 22286 1726882815.20841: done dumping result, returning 22286 1726882815.20844: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0affe814-3a2d-a75d-4836-0000000005cc] 22286 1726882815.20846: sending task result for task 0affe814-3a2d-a75d-4836-0000000005cc 22286 1726882815.20916: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005cc 22286 1726882815.20919: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 22286 1726882815.20983: no more pending results, returning what we have 22286 1726882815.20988: results queue empty 22286 1726882815.20989: checking for any_errors_fatal 22286 1726882815.20991: done checking for any_errors_fatal 22286 1726882815.20992: 
checking for max_fail_percentage 22286 1726882815.20994: done checking for max_fail_percentage 22286 1726882815.20995: checking to see if all hosts have failed and the running result is not ok 22286 1726882815.20996: done checking to see if all hosts have failed 22286 1726882815.20997: getting the remaining hosts for this loop 22286 1726882815.20999: done getting the remaining hosts for this loop 22286 1726882815.21003: getting the next task for host managed_node3 22286 1726882815.21010: done getting next task for host managed_node3 22286 1726882815.21013: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 22286 1726882815.21017: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882815.21022: getting variables 22286 1726882815.21023: in VariableManager get_vars() 22286 1726882815.21064: Calling all_inventory to load vars for managed_node3 22286 1726882815.21068: Calling groups_inventory to load vars for managed_node3 22286 1726882815.21070: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.21082: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.21086: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.21090: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.31503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.36477: done with get_vars() 22286 1726882815.36522: done getting variables 22286 1726882815.36620: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:40:15 -0400 (0:00:00.177) 0:00:38.759 ****** 22286 1726882815.36659: entering _queue_task() for managed_node3/fail 22286 1726882815.37069: worker is 1 (out of 1 available) 22286 1726882815.37083: exiting _queue_task() for managed_node3/fail 22286 1726882815.37099: done queuing things up, now waiting for results queue to drain 22286 1726882815.37101: waiting for pending results... 
22286 1726882815.37561: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 22286 1726882815.37595: in run() - task 0affe814-3a2d-a75d-4836-0000000005cd 22286 1726882815.37620: variable 'ansible_search_path' from source: unknown 22286 1726882815.37671: variable 'ansible_search_path' from source: unknown 22286 1726882815.37700: calling self._execute() 22286 1726882815.37833: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.37853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.37888: variable 'omit' from source: magic vars 22286 1726882815.38433: variable 'ansible_distribution_major_version' from source: facts 22286 1726882815.38442: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882815.38633: variable 'type' from source: play vars 22286 1726882815.38653: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 22286 1726882815.38668: when evaluation is False, skipping this task 22286 1726882815.38762: _execute() done 22286 1726882815.38767: dumping result to json 22286 1726882815.38771: done dumping result, returning 22286 1726882815.38773: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0affe814-3a2d-a75d-4836-0000000005cd] 22286 1726882815.38779: sending task result for task 0affe814-3a2d-a75d-4836-0000000005cd 22286 1726882815.38857: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005cd 22286 1726882815.38860: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 22286 1726882815.38924: no more pending results, returning what we have 22286 1726882815.38928: results queue empty 22286 1726882815.38929: checking for any_errors_fatal 22286 1726882815.38941: done checking for any_errors_fatal 22286 1726882815.38943: 
checking for max_fail_percentage 22286 1726882815.38945: done checking for max_fail_percentage 22286 1726882815.38946: checking to see if all hosts have failed and the running result is not ok 22286 1726882815.38947: done checking to see if all hosts have failed 22286 1726882815.38948: getting the remaining hosts for this loop 22286 1726882815.38951: done getting the remaining hosts for this loop 22286 1726882815.38956: getting the next task for host managed_node3 22286 1726882815.38964: done getting next task for host managed_node3 22286 1726882815.38967: ^ task is: TASK: Include the task 'show_interfaces.yml' 22286 1726882815.38971: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882815.38979: getting variables 22286 1726882815.38981: in VariableManager get_vars() 22286 1726882815.39029: Calling all_inventory to load vars for managed_node3 22286 1726882815.39033: Calling groups_inventory to load vars for managed_node3 22286 1726882815.39141: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.39161: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.39165: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.39170: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.42759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.46538: done with get_vars() 22286 1726882815.46574: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:40:15 -0400 (0:00:00.100) 0:00:38.860 ****** 22286 1726882815.46689: entering _queue_task() for managed_node3/include_tasks 22286 1726882815.47007: worker is 1 (out of 1 available) 22286 1726882815.47020: exiting _queue_task() for managed_node3/include_tasks 22286 1726882815.47036: done queuing things up, now waiting for results queue to drain 22286 1726882815.47038: waiting for pending results... 
22286 1726882815.47455: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 22286 1726882815.47474: in run() - task 0affe814-3a2d-a75d-4836-0000000005ce 22286 1726882815.47498: variable 'ansible_search_path' from source: unknown 22286 1726882815.47662: variable 'ansible_search_path' from source: unknown 22286 1726882815.47666: calling self._execute() 22286 1726882815.47829: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.47892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.48094: variable 'omit' from source: magic vars 22286 1726882815.48910: variable 'ansible_distribution_major_version' from source: facts 22286 1726882815.48929: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882815.48944: _execute() done 22286 1726882815.48954: dumping result to json 22286 1726882815.49180: done dumping result, returning 22286 1726882815.49184: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-a75d-4836-0000000005ce] 22286 1726882815.49187: sending task result for task 0affe814-3a2d-a75d-4836-0000000005ce 22286 1726882815.49263: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005ce 22286 1726882815.49266: WORKER PROCESS EXITING 22286 1726882815.49311: no more pending results, returning what we have 22286 1726882815.49316: in VariableManager get_vars() 22286 1726882815.49372: Calling all_inventory to load vars for managed_node3 22286 1726882815.49378: Calling groups_inventory to load vars for managed_node3 22286 1726882815.49382: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.49398: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.49402: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.49406: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.54350: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.58839: done with get_vars() 22286 1726882815.58873: variable 'ansible_search_path' from source: unknown 22286 1726882815.58877: variable 'ansible_search_path' from source: unknown 22286 1726882815.58924: we have included files to process 22286 1726882815.58926: generating all_blocks data 22286 1726882815.58928: done generating all_blocks data 22286 1726882815.58938: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882815.58940: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882815.58943: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22286 1726882815.59082: in VariableManager get_vars() 22286 1726882815.59116: done with get_vars() 22286 1726882815.59257: done processing included file 22286 1726882815.59260: iterating over new_blocks loaded from include file 22286 1726882815.59263: in VariableManager get_vars() 22286 1726882815.59291: done with get_vars() 22286 1726882815.59294: filtering new block on tags 22286 1726882815.59317: done filtering new block on tags 22286 1726882815.59319: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 22286 1726882815.59325: extending task lists for all hosts with included blocks 22286 1726882815.59918: done extending task lists 22286 1726882815.59919: done processing included files 22286 1726882815.59920: results queue empty 22286 1726882815.59921: checking for any_errors_fatal 22286 1726882815.59925: done checking for any_errors_fatal 22286 1726882815.59926: checking for 
max_fail_percentage 22286 1726882815.59928: done checking for max_fail_percentage 22286 1726882815.59929: checking to see if all hosts have failed and the running result is not ok 22286 1726882815.59930: done checking to see if all hosts have failed 22286 1726882815.59931: getting the remaining hosts for this loop 22286 1726882815.59932: done getting the remaining hosts for this loop 22286 1726882815.59937: getting the next task for host managed_node3 22286 1726882815.59942: done getting next task for host managed_node3 22286 1726882815.59945: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22286 1726882815.59948: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882815.59952: getting variables 22286 1726882815.59953: in VariableManager get_vars() 22286 1726882815.59970: Calling all_inventory to load vars for managed_node3 22286 1726882815.59973: Calling groups_inventory to load vars for managed_node3 22286 1726882815.59978: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.59985: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.59988: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.59992: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.62070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.65913: done with get_vars() 22286 1726882815.65949: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:40:15 -0400 (0:00:00.196) 0:00:39.056 ****** 22286 1726882815.66355: entering _queue_task() for managed_node3/include_tasks 22286 1726882815.66814: worker is 1 (out of 1 available) 22286 1726882815.66826: exiting _queue_task() for managed_node3/include_tasks 22286 1726882815.67041: done queuing things up, now waiting for results queue to drain 22286 1726882815.67043: waiting for pending results... 
22286 1726882815.67180: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 22286 1726882815.67384: in run() - task 0affe814-3a2d-a75d-4836-0000000006e4 22286 1726882815.67388: variable 'ansible_search_path' from source: unknown 22286 1726882815.67391: variable 'ansible_search_path' from source: unknown 22286 1726882815.67394: calling self._execute() 22286 1726882815.67502: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.67518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.67540: variable 'omit' from source: magic vars 22286 1726882815.68014: variable 'ansible_distribution_major_version' from source: facts 22286 1726882815.68038: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882815.68052: _execute() done 22286 1726882815.68062: dumping result to json 22286 1726882815.68071: done dumping result, returning 22286 1726882815.68088: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-a75d-4836-0000000006e4] 22286 1726882815.68100: sending task result for task 0affe814-3a2d-a75d-4836-0000000006e4 22286 1726882815.68316: done sending task result for task 0affe814-3a2d-a75d-4836-0000000006e4 22286 1726882815.68321: WORKER PROCESS EXITING 22286 1726882815.68353: no more pending results, returning what we have 22286 1726882815.68358: in VariableManager get_vars() 22286 1726882815.68416: Calling all_inventory to load vars for managed_node3 22286 1726882815.68420: Calling groups_inventory to load vars for managed_node3 22286 1726882815.68424: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.68442: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.68446: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.68451: Calling groups_plugins_play to load vars for managed_node3 22286 
1726882815.72171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.76516: done with get_vars() 22286 1726882815.76550: variable 'ansible_search_path' from source: unknown 22286 1726882815.76552: variable 'ansible_search_path' from source: unknown 22286 1726882815.76628: we have included files to process 22286 1726882815.76630: generating all_blocks data 22286 1726882815.76632: done generating all_blocks data 22286 1726882815.76636: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882815.76637: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882815.76640: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22286 1726882815.77391: done processing included file 22286 1726882815.77394: iterating over new_blocks loaded from include file 22286 1726882815.77396: in VariableManager get_vars() 22286 1726882815.77425: done with get_vars() 22286 1726882815.77428: filtering new block on tags 22286 1726882815.77456: done filtering new block on tags 22286 1726882815.77459: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 22286 1726882815.77466: extending task lists for all hosts with included blocks 22286 1726882815.77894: done extending task lists 22286 1726882815.77896: done processing included files 22286 1726882815.77897: results queue empty 22286 1726882815.77898: checking for any_errors_fatal 22286 1726882815.77903: done checking for any_errors_fatal 22286 1726882815.77904: checking for max_fail_percentage 22286 1726882815.77905: done 
checking for max_fail_percentage 22286 1726882815.77907: checking to see if all hosts have failed and the running result is not ok 22286 1726882815.77908: done checking to see if all hosts have failed 22286 1726882815.77909: getting the remaining hosts for this loop 22286 1726882815.77910: done getting the remaining hosts for this loop 22286 1726882815.77914: getting the next task for host managed_node3 22286 1726882815.77920: done getting next task for host managed_node3 22286 1726882815.77922: ^ task is: TASK: Gather current interface info 22286 1726882815.77927: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882815.77930: getting variables 22286 1726882815.77931: in VariableManager get_vars() 22286 1726882815.78154: Calling all_inventory to load vars for managed_node3 22286 1726882815.78157: Calling groups_inventory to load vars for managed_node3 22286 1726882815.78160: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882815.78167: Calling all_plugins_play to load vars for managed_node3 22286 1726882815.78171: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882815.78175: Calling groups_plugins_play to load vars for managed_node3 22286 1726882815.82510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882815.88153: done with get_vars() 22286 1726882815.88197: done getting variables 22286 1726882815.88373: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:40:15 -0400 (0:00:00.220) 0:00:39.277 ****** 22286 1726882815.88417: entering _queue_task() for managed_node3/command 22286 1726882815.89225: worker is 1 (out of 1 available) 22286 1726882815.89243: exiting _queue_task() for managed_node3/command 22286 1726882815.89259: done queuing things up, now waiting for results queue to drain 22286 1726882815.89261: waiting for pending results... 
22286 1726882815.90091: running TaskExecutor() for managed_node3/TASK: Gather current interface info 22286 1726882815.90737: in run() - task 0affe814-3a2d-a75d-4836-00000000071b 22286 1726882815.90742: variable 'ansible_search_path' from source: unknown 22286 1726882815.90745: variable 'ansible_search_path' from source: unknown 22286 1726882815.90748: calling self._execute() 22286 1726882815.91024: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.91031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.91047: variable 'omit' from source: magic vars 22286 1726882815.92699: variable 'ansible_distribution_major_version' from source: facts 22286 1726882815.92714: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882815.92723: variable 'omit' from source: magic vars 22286 1726882815.92942: variable 'omit' from source: magic vars 22286 1726882815.92986: variable 'omit' from source: magic vars 22286 1726882815.93469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882815.93473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882815.93478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882815.93481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882815.93552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882815.93593: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882815.93596: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.93601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 
1726882815.93989: Set connection var ansible_shell_executable to /bin/sh 22286 1726882815.94224: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882815.94227: Set connection var ansible_connection to ssh 22286 1726882815.94230: Set connection var ansible_shell_type to sh 22286 1726882815.94239: Set connection var ansible_timeout to 10 22286 1726882815.94250: Set connection var ansible_pipelining to False 22286 1726882815.94280: variable 'ansible_shell_executable' from source: unknown 22286 1726882815.94284: variable 'ansible_connection' from source: unknown 22286 1726882815.94287: variable 'ansible_module_compression' from source: unknown 22286 1726882815.94290: variable 'ansible_shell_type' from source: unknown 22286 1726882815.94292: variable 'ansible_shell_executable' from source: unknown 22286 1726882815.94295: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882815.94300: variable 'ansible_pipelining' from source: unknown 22286 1726882815.94303: variable 'ansible_timeout' from source: unknown 22286 1726882815.94310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882815.94791: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882815.94804: variable 'omit' from source: magic vars 22286 1726882815.94810: starting attempt loop 22286 1726882815.94814: running the handler 22286 1726882815.94833: _low_level_execute_command(): starting 22286 1726882815.94963: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882815.96603: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882815.96707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882815.96819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882815.98706: stdout chunk (state=3): >>>/root <<< 22286 1726882815.98859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882815.98925: stderr chunk (state=3): >>><<< 22286 1726882815.98971: stdout chunk (state=3): >>><<< 22286 1726882815.99028: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882815.99056: _low_level_execute_command(): starting 22286 1726882815.99062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492 `" && echo ansible-tmp-1726882815.9902775-23739-66005416357492="` echo /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492 `" ) && sleep 0' 22286 1726882815.99995: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882816.00005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882816.00050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.00121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882816.00140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.00173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.00288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.02578: stdout chunk (state=3): >>>ansible-tmp-1726882815.9902775-23739-66005416357492=/root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492 <<< 22286 1726882816.02917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.02920: stdout chunk (state=3): >>><<< 22286 1726882816.02926: stderr chunk (state=3): >>><<< 22286 1726882816.02929: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882815.9902775-23739-66005416357492=/root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882816.02932: variable 'ansible_module_compression' from source: unknown 22286 1726882816.02980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882816.03017: variable 'ansible_facts' from source: unknown 22286 1726882816.03308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py 22286 1726882816.03952: Sending initial data 22286 1726882816.03955: Sent initial data (155 bytes) 22286 1726882816.04867: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882816.05142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.05353: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 22286 1726882816.05512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.07383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882816.07560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882816.07702: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp8ztyyv7y /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py <<< 22286 1726882816.07707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py" <<< 22286 1726882816.07901: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp8ztyyv7y" to remote "/root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py" <<< 22286 1726882816.10242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.10281: stderr chunk (state=3): >>><<< 22286 1726882816.10285: stdout chunk (state=3): >>><<< 22286 1726882816.10315: done transferring module to remote 22286 1726882816.10328: _low_level_execute_command(): starting 22286 1726882816.10334: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/ /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py && sleep 0' 22286 1726882816.11124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.11156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.11301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.13439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.13640: stdout chunk (state=3): >>><<< 22286 1726882816.13644: stderr chunk (state=3): >>><<< 22286 1726882816.13647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882816.13654: _low_level_execute_command(): starting 22286 1726882816.13657: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/AnsiballZ_command.py && sleep 0' 22286 1726882816.14100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882816.14113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882816.14150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882816.14159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882816.14206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.14271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882816.14283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.14330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.14456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.33053: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:16.324845", "end": "2024-09-20 21:40:16.328546", "delta": "0:00:00.003701", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882816.34875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882816.34935: stderr chunk (state=3): >>><<< 22286 1726882816.34940: stdout chunk (state=3): >>><<< 22286 1726882816.34956: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:16.324845", "end": "2024-09-20 21:40:16.328546", "delta": "0:00:00.003701", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882816.34994: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882816.35005: _low_level_execute_command(): starting 22286 1726882816.35012: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882815.9902775-23739-66005416357492/ > /dev/null 2>&1 && sleep 0' 22286 1726882816.35439: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882816.35477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 
10.31.41.238 <<< 22286 1726882816.35481: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882816.35484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.35533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.35541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.35668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.37770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.37823: stderr chunk (state=3): >>><<< 22286 1726882816.37826: stdout chunk (state=3): >>><<< 22286 1726882816.37843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882816.37856: handler run complete 22286 1726882816.37897: Evaluated conditional (False): False 22286 1726882816.37901: attempt loop complete, returning result 22286 1726882816.37903: _execute() done 22286 1726882816.37906: dumping result to json 22286 1726882816.37918: done dumping result, returning 22286 1726882816.37925: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affe814-3a2d-a75d-4836-00000000071b] 22286 1726882816.37928: sending task result for task 0affe814-3a2d-a75d-4836-00000000071b 22286 1726882816.38079: done sending task result for task 0affe814-3a2d-a75d-4836-00000000071b 22286 1726882816.38082: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003701", "end": "2024-09-20 21:40:16.328546", "rc": 0, "start": "2024-09-20 21:40:16.324845" } STDOUT: bonding_masters eth0 lo veth0 22286 1726882816.38213: no more pending results, returning what we have 22286 1726882816.38216: results queue empty 22286 1726882816.38217: checking for any_errors_fatal 22286 1726882816.38219: done checking for any_errors_fatal 22286 1726882816.38220: checking for max_fail_percentage 22286 1726882816.38222: done checking for max_fail_percentage 22286 1726882816.38223: checking to see if all hosts have failed and the running result is not ok 22286 1726882816.38224: done checking to see if all hosts have failed 22286 1726882816.38225: getting the remaining hosts for 
this loop 22286 1726882816.38226: done getting the remaining hosts for this loop 22286 1726882816.38231: getting the next task for host managed_node3 22286 1726882816.38240: done getting next task for host managed_node3 22286 1726882816.38243: ^ task is: TASK: Set current_interfaces 22286 1726882816.38248: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882816.38253: getting variables 22286 1726882816.38254: in VariableManager get_vars() 22286 1726882816.38297: Calling all_inventory to load vars for managed_node3 22286 1726882816.38300: Calling groups_inventory to load vars for managed_node3 22286 1726882816.38303: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882816.38314: Calling all_plugins_play to load vars for managed_node3 22286 1726882816.38317: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882816.38320: Calling groups_plugins_play to load vars for managed_node3 22286 1726882816.40364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882816.42710: done with get_vars() 22286 1726882816.42745: done getting variables 22286 1726882816.42813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:40:16 -0400 (0:00:00.544) 0:00:39.821 ****** 22286 1726882816.42844: entering _queue_task() for managed_node3/set_fact 22286 1726882816.43218: worker is 1 (out of 1 available) 22286 1726882816.43231: exiting _queue_task() for managed_node3/set_fact 22286 1726882816.43252: done queuing things up, now waiting for results queue to drain 22286 1726882816.43254: waiting for pending results... 
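The "Set current_interfaces" task queued above (`get_current_interfaces.yml:9`) is likely shaped as follows. This is a sketch inferred from the resulting fact shown in the trace, where `current_interfaces` ends up equal to the stdout lines of the earlier command; the real expression in the role may differ.

```yaml
# Hypothetical reconstruction: the fact value matches the command task's
# stdout_lines, but the exact Jinja2 expression is not shown in the log.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```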
22286 1726882816.43708: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 22286 1726882816.43895: in run() - task 0affe814-3a2d-a75d-4836-00000000071c 22286 1726882816.43925: variable 'ansible_search_path' from source: unknown 22286 1726882816.43933: variable 'ansible_search_path' from source: unknown 22286 1726882816.44197: calling self._execute() 22286 1726882816.44371: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.44392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.44396: variable 'omit' from source: magic vars 22286 1726882816.44855: variable 'ansible_distribution_major_version' from source: facts 22286 1726882816.44878: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882816.44886: variable 'omit' from source: magic vars 22286 1726882816.44939: variable 'omit' from source: magic vars 22286 1726882816.45031: variable '_current_interfaces' from source: set_fact 22286 1726882816.45090: variable 'omit' from source: magic vars 22286 1726882816.45125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882816.45159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882816.45182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882816.45199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882816.45211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882816.45242: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882816.45245: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.45248: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.45339: Set connection var ansible_shell_executable to /bin/sh 22286 1726882816.45348: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882816.45351: Set connection var ansible_connection to ssh 22286 1726882816.45354: Set connection var ansible_shell_type to sh 22286 1726882816.45361: Set connection var ansible_timeout to 10 22286 1726882816.45371: Set connection var ansible_pipelining to False 22286 1726882816.45397: variable 'ansible_shell_executable' from source: unknown 22286 1726882816.45400: variable 'ansible_connection' from source: unknown 22286 1726882816.45403: variable 'ansible_module_compression' from source: unknown 22286 1726882816.45406: variable 'ansible_shell_type' from source: unknown 22286 1726882816.45409: variable 'ansible_shell_executable' from source: unknown 22286 1726882816.45411: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.45415: variable 'ansible_pipelining' from source: unknown 22286 1726882816.45418: variable 'ansible_timeout' from source: unknown 22286 1726882816.45424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.45545: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882816.45554: variable 'omit' from source: magic vars 22286 1726882816.45560: starting attempt loop 22286 1726882816.45564: running the handler 22286 1726882816.45577: handler run complete 22286 1726882816.45589: attempt loop complete, returning result 22286 1726882816.45592: _execute() done 22286 1726882816.45597: dumping result to json 22286 1726882816.45599: done dumping result, returning 22286 
1726882816.45602: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affe814-3a2d-a75d-4836-00000000071c] 22286 1726882816.45614: sending task result for task 0affe814-3a2d-a75d-4836-00000000071c 22286 1726882816.45701: done sending task result for task 0affe814-3a2d-a75d-4836-00000000071c 22286 1726882816.45704: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 22286 1726882816.45783: no more pending results, returning what we have 22286 1726882816.45786: results queue empty 22286 1726882816.45787: checking for any_errors_fatal 22286 1726882816.45797: done checking for any_errors_fatal 22286 1726882816.45798: checking for max_fail_percentage 22286 1726882816.45800: done checking for max_fail_percentage 22286 1726882816.45801: checking to see if all hosts have failed and the running result is not ok 22286 1726882816.45802: done checking to see if all hosts have failed 22286 1726882816.45803: getting the remaining hosts for this loop 22286 1726882816.45805: done getting the remaining hosts for this loop 22286 1726882816.45809: getting the next task for host managed_node3 22286 1726882816.45820: done getting next task for host managed_node3 22286 1726882816.45823: ^ task is: TASK: Show current_interfaces 22286 1726882816.45829: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882816.45833: getting variables 22286 1726882816.45837: in VariableManager get_vars() 22286 1726882816.45874: Calling all_inventory to load vars for managed_node3 22286 1726882816.45880: Calling groups_inventory to load vars for managed_node3 22286 1726882816.45882: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882816.45893: Calling all_plugins_play to load vars for managed_node3 22286 1726882816.45895: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882816.45899: Calling groups_plugins_play to load vars for managed_node3 22286 1726882816.47664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882816.52393: done with get_vars() 22286 1726882816.52548: done getting variables 22286 1726882816.52628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:40:16 -0400 (0:00:00.098) 0:00:39.920 ****** 22286 1726882816.52706: entering _queue_task() for managed_node3/debug 22286 1726882816.53585: worker is 1 (out of 1 available) 22286 1726882816.53600: exiting _queue_task() for managed_node3/debug 22286 1726882816.53612: done queuing things up, now waiting for results queue to drain 22286 1726882816.53614: waiting for 
pending results... 22286 1726882816.54139: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 22286 1726882816.54243: in run() - task 0affe814-3a2d-a75d-4836-0000000006e5 22286 1726882816.54271: variable 'ansible_search_path' from source: unknown 22286 1726882816.54288: variable 'ansible_search_path' from source: unknown 22286 1726882816.54337: calling self._execute() 22286 1726882816.54615: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.54619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.54622: variable 'omit' from source: magic vars 22286 1726882816.55323: variable 'ansible_distribution_major_version' from source: facts 22286 1726882816.55345: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882816.55385: variable 'omit' from source: magic vars 22286 1726882816.55445: variable 'omit' from source: magic vars 22286 1726882816.55600: variable 'current_interfaces' from source: set_fact 22286 1726882816.55710: variable 'omit' from source: magic vars 22286 1726882816.55767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882816.55826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882816.55860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882816.55888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882816.55907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882816.56038: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882816.56043: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.56046: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.56120: Set connection var ansible_shell_executable to /bin/sh 22286 1726882816.56143: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882816.56152: Set connection var ansible_connection to ssh 22286 1726882816.56163: Set connection var ansible_shell_type to sh 22286 1726882816.56179: Set connection var ansible_timeout to 10 22286 1726882816.56195: Set connection var ansible_pipelining to False 22286 1726882816.56224: variable 'ansible_shell_executable' from source: unknown 22286 1726882816.56233: variable 'ansible_connection' from source: unknown 22286 1726882816.56243: variable 'ansible_module_compression' from source: unknown 22286 1726882816.56255: variable 'ansible_shell_type' from source: unknown 22286 1726882816.56340: variable 'ansible_shell_executable' from source: unknown 22286 1726882816.56343: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.56346: variable 'ansible_pipelining' from source: unknown 22286 1726882816.56348: variable 'ansible_timeout' from source: unknown 22286 1726882816.56350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.56480: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882816.56503: variable 'omit' from source: magic vars 22286 1726882816.56515: starting attempt loop 22286 1726882816.56523: running the handler 22286 1726882816.56596: handler run complete 22286 1726882816.56690: attempt loop complete, returning result 22286 1726882816.56693: _execute() done 22286 1726882816.56696: dumping result to json 22286 1726882816.56699: done dumping result, returning 22286 
1726882816.56701: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affe814-3a2d-a75d-4836-0000000006e5] 22286 1726882816.56703: sending task result for task 0affe814-3a2d-a75d-4836-0000000006e5 22286 1726882816.56772: done sending task result for task 0affe814-3a2d-a75d-4836-0000000006e5 22286 1726882816.56778: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 22286 1726882816.56839: no more pending results, returning what we have 22286 1726882816.56843: results queue empty 22286 1726882816.56844: checking for any_errors_fatal 22286 1726882816.56852: done checking for any_errors_fatal 22286 1726882816.56853: checking for max_fail_percentage 22286 1726882816.56856: done checking for max_fail_percentage 22286 1726882816.56857: checking to see if all hosts have failed and the running result is not ok 22286 1726882816.56858: done checking to see if all hosts have failed 22286 1726882816.56859: getting the remaining hosts for this loop 22286 1726882816.56861: done getting the remaining hosts for this loop 22286 1726882816.56866: getting the next task for host managed_node3 22286 1726882816.56880: done getting next task for host managed_node3 22286 1726882816.56885: ^ task is: TASK: Install iproute 22286 1726882816.56889: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882816.56894: getting variables 22286 1726882816.56896: in VariableManager get_vars() 22286 1726882816.56957: Calling all_inventory to load vars for managed_node3 22286 1726882816.56960: Calling groups_inventory to load vars for managed_node3 22286 1726882816.56964: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882816.56980: Calling all_plugins_play to load vars for managed_node3 22286 1726882816.56985: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882816.56989: Calling groups_plugins_play to load vars for managed_node3 22286 1726882816.60254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882816.63910: done with get_vars() 22286 1726882816.64016: done getting variables 22286 1726882816.64093: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:40:16 -0400 (0:00:00.114) 0:00:40.034 ****** 22286 1726882816.64129: entering _queue_task() for managed_node3/package 22286 1726882816.64668: worker is 1 (out of 1 available) 22286 1726882816.64686: exiting _queue_task() for managed_node3/package 22286 1726882816.64704: done queuing things up, now waiting for results queue to drain 22286 1726882816.64741: waiting for pending results... 
22286 1726882816.65186: running TaskExecutor() for managed_node3/TASK: Install iproute 22286 1726882816.65225: in run() - task 0affe814-3a2d-a75d-4836-0000000005cf 22286 1726882816.65367: variable 'ansible_search_path' from source: unknown 22286 1726882816.65371: variable 'ansible_search_path' from source: unknown 22286 1726882816.65426: calling self._execute() 22286 1726882816.65617: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.65711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.65715: variable 'omit' from source: magic vars 22286 1726882816.66582: variable 'ansible_distribution_major_version' from source: facts 22286 1726882816.66586: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882816.66589: variable 'omit' from source: magic vars 22286 1726882816.66654: variable 'omit' from source: magic vars 22286 1726882816.67174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22286 1726882816.70149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22286 1726882816.70240: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22286 1726882816.70297: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22286 1726882816.70768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22286 1726882816.70808: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22286 1726882816.70943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22286 1726882816.71020: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22286 1726882816.71069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22286 1726882816.71129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22286 1726882816.71262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22286 1726882816.71296: variable '__network_is_ostree' from source: set_fact 22286 1726882816.71308: variable 'omit' from source: magic vars 22286 1726882816.71350: variable 'omit' from source: magic vars 22286 1726882816.71398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882816.71438: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882816.71466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882816.71504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882816.71523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882816.71567: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882816.71584: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.71599: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 22286 1726882816.71748: Set connection var ansible_shell_executable to /bin/sh 22286 1726882816.71765: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882816.71773: Set connection var ansible_connection to ssh 22286 1726882816.71783: Set connection var ansible_shell_type to sh 22286 1726882816.71795: Set connection var ansible_timeout to 10 22286 1726882816.71818: Set connection var ansible_pipelining to False 22286 1726882816.71916: variable 'ansible_shell_executable' from source: unknown 22286 1726882816.71919: variable 'ansible_connection' from source: unknown 22286 1726882816.71922: variable 'ansible_module_compression' from source: unknown 22286 1726882816.71924: variable 'ansible_shell_type' from source: unknown 22286 1726882816.71926: variable 'ansible_shell_executable' from source: unknown 22286 1726882816.71928: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882816.71930: variable 'ansible_pipelining' from source: unknown 22286 1726882816.71932: variable 'ansible_timeout' from source: unknown 22286 1726882816.71936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882816.72038: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882816.72057: variable 'omit' from source: magic vars 22286 1726882816.72068: starting attempt loop 22286 1726882816.72078: running the handler 22286 1726882816.72092: variable 'ansible_facts' from source: unknown 22286 1726882816.72100: variable 'ansible_facts' from source: unknown 22286 1726882816.72155: _low_level_execute_command(): starting 22286 1726882816.72243: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 
1726882816.73025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882816.73109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.73166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.73274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.75136: stdout chunk (state=3): >>>/root <<< 22286 1726882816.75342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.75345: stdout chunk (state=3): >>><<< 22286 1726882816.75348: stderr chunk (state=3): >>><<< 22286 1726882816.75365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882816.75383: _low_level_execute_command(): starting 22286 1726882816.75413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410 `" && echo ansible-tmp-1726882816.7536528-23771-44846517092410="` echo /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410 `" ) && sleep 0' 22286 1726882816.75963: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882816.75966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.75969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882816.75971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.76038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.76158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.78352: stdout chunk (state=3): >>>ansible-tmp-1726882816.7536528-23771-44846517092410=/root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410 <<< 22286 1726882816.78468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.78512: stderr chunk (state=3): >>><<< 22286 1726882816.78515: stdout chunk (state=3): >>><<< 22286 1726882816.78545: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882816.7536528-23771-44846517092410=/root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882816.78572: variable 'ansible_module_compression' from source: unknown 22286 1726882816.78626: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 22286 1726882816.78664: variable 'ansible_facts' from source: unknown 22286 1726882816.78763: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py 22286 1726882816.78890: Sending initial data 22286 1726882816.78894: Sent initial data (151 bytes) 22286 1726882816.79368: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882816.79371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.79374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882816.79376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.79454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.79567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.81377: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 22286 1726882816.81389: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882816.81495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882816.81659: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmpoxrwcmih /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py <<< 22286 1726882816.81662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py" <<< 22286 1726882816.81856: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmpoxrwcmih" to remote "/root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py" <<< 22286 1726882816.83529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.83593: stderr chunk (state=3): >>><<< 22286 1726882816.83640: stdout chunk (state=3): >>><<< 22286 1726882816.83643: done transferring module to remote 22286 1726882816.83646: _low_level_execute_command(): starting 22286 1726882816.83648: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/ /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py && sleep 0' 22286 1726882816.84069: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882816.84072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882816.84078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882816.84080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882816.84083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.84153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882816.84159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.84179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882816.84300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882816.86359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882816.86395: stderr chunk (state=3): >>><<< 22286 1726882816.86399: stdout chunk (state=3): >>><<< 22286 1726882816.86411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882816.86414: _low_level_execute_command(): starting 22286 1726882816.86422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/AnsiballZ_dnf.py && sleep 0' 22286 1726882816.86844: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882816.86888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882816.86892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882816.86932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882816.86940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 22286 1726882816.87062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882818.40059: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22286 1726882818.45642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882818.45648: stdout chunk (state=3): >>><<< 22286 1726882818.45651: stderr chunk (state=3): >>><<< 22286 1726882818.45655: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882818.45659: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882818.45672: _low_level_execute_command(): starting 22286 1726882818.45678: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882816.7536528-23771-44846517092410/ > /dev/null 2>&1 && sleep 0' 22286 1726882818.46592: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882818.46705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882818.46937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882818.47021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882818.47270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882818.49388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882818.49392: stdout chunk (state=3): >>><<< 22286 1726882818.49400: stderr chunk (state=3): >>><<< 22286 1726882818.49443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
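
The `_execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', ...})` invocation above corresponds to a task of roughly this shape. This is a hedged reconstruction from the log, not the actual collection source: the module and arguments come from the invocation dictionary, and the `register`/`until` names are inferred from the `attempts: 1` in the result and the `__install_status is success` conditional evaluated just below.

```yaml
# Sketch reconstructed from the log; the real task lives in the
# fedora.linux_system_roles test collection and may differ in detail.
- name: Install iproute
  ansible.builtin.dnf:        # logged as ansible.legacy.dnf
    name: iproute
    state: present
  register: __install_status  # name inferred from the "__install_status is success" check
  until: __install_status is success
```

On this host the package was already present, hence the `"msg": "Nothing to do", "changed": false` result on the first attempt.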
22286 1726882818.49452: handler run complete 22286 1726882818.49713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22286 1726882818.50029: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22286 1726882818.50128: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22286 1726882818.50168: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22286 1726882818.50340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22286 1726882818.50658: variable '__install_status' from source: set_fact 22286 1726882818.50686: Evaluated conditional (__install_status is success): True 22286 1726882818.50708: attempt loop complete, returning result 22286 1726882818.50712: _execute() done 22286 1726882818.50715: dumping result to json 22286 1726882818.50724: done dumping result, returning 22286 1726882818.50733: done running TaskExecutor() for managed_node3/TASK: Install iproute [0affe814-3a2d-a75d-4836-0000000005cf] 22286 1726882818.50741: sending task result for task 0affe814-3a2d-a75d-4836-0000000005cf 22286 1726882818.50893: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005cf 22286 1726882818.50897: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22286 1726882818.51063: no more pending results, returning what we have 22286 1726882818.51067: results queue empty 22286 1726882818.51068: checking for any_errors_fatal 22286 1726882818.51077: done checking for any_errors_fatal 22286 1726882818.51079: checking for max_fail_percentage 22286 1726882818.51081: done checking for max_fail_percentage 22286 1726882818.51082: checking to see if all hosts have failed and the running result is not ok 22286 1726882818.51083: done checking to see if 
all hosts have failed 22286 1726882818.51084: getting the remaining hosts for this loop 22286 1726882818.51086: done getting the remaining hosts for this loop 22286 1726882818.51091: getting the next task for host managed_node3 22286 1726882818.51099: done getting next task for host managed_node3 22286 1726882818.51103: ^ task is: TASK: Create veth interface {{ interface }} 22286 1726882818.51107: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22286 1726882818.51111: getting variables 22286 1726882818.51113: in VariableManager get_vars() 22286 1726882818.51285: Calling all_inventory to load vars for managed_node3 22286 1726882818.51288: Calling groups_inventory to load vars for managed_node3 22286 1726882818.51291: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882818.51305: Calling all_plugins_play to load vars for managed_node3 22286 1726882818.51308: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882818.51312: Calling groups_plugins_play to load vars for managed_node3 22286 1726882818.54439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882818.56558: done with get_vars() 22286 1726882818.56593: done getting variables 22286 1726882818.56666: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882818.56797: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:40:18 -0400 (0:00:01.927) 0:00:41.961 ****** 22286 1726882818.56832: entering _queue_task() for managed_node3/command 22286 1726882818.57458: worker is 1 (out of 1 available) 22286 1726882818.57470: exiting _queue_task() for managed_node3/command 22286 1726882818.57485: done queuing things up, now waiting for results queue to drain 22286 1726882818.57486: waiting for pending results... 22286 1726882818.58261: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 22286 1726882818.58490: in run() - task 0affe814-3a2d-a75d-4836-0000000005d0 22286 1726882818.58504: variable 'ansible_search_path' from source: unknown 22286 1726882818.58508: variable 'ansible_search_path' from source: unknown 22286 1726882818.58926: variable 'interface' from source: play vars 22286 1726882818.58998: variable 'interface' from source: play vars 22286 1726882818.59101: variable 'interface' from source: play vars 22286 1726882818.59300: Loaded config def from plugin (lookup/items) 22286 1726882818.59314: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 22286 1726882818.59351: variable 'omit' from source: magic vars 22286 1726882818.59514: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.59532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.59576: variable 'omit' from source: magic vars 22286 1726882818.59865: variable 
'ansible_distribution_major_version' from source: facts 22286 1726882818.59880: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882818.60275: variable 'type' from source: play vars 22286 1726882818.60279: variable 'state' from source: include params 22286 1726882818.60281: variable 'interface' from source: play vars 22286 1726882818.60283: variable 'current_interfaces' from source: set_fact 22286 1726882818.60285: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 22286 1726882818.60288: when evaluation is False, skipping this task 22286 1726882818.60290: variable 'item' from source: unknown 22286 1726882818.60468: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 22286 1726882818.61290: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.61294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.61296: variable 'omit' from source: magic vars 22286 1726882818.61367: variable 'ansible_distribution_major_version' from source: facts 22286 1726882818.61574: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882818.62118: variable 'type' from source: play vars 22286 1726882818.62123: variable 'state' from source: include params 22286 1726882818.62126: variable 'interface' from source: play vars 22286 1726882818.62128: variable 'current_interfaces' from source: set_fact 22286 1726882818.62130: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 22286 1726882818.62133: when evaluation is False, 
skipping this task 22286 1726882818.62137: variable 'item' from source: unknown 22286 1726882818.62246: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 22286 1726882818.62540: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.62543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.62546: variable 'omit' from source: magic vars 22286 1726882818.62938: variable 'ansible_distribution_major_version' from source: facts 22286 1726882818.62953: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882818.63487: variable 'type' from source: play vars 22286 1726882818.63503: variable 'state' from source: include params 22286 1726882818.63512: variable 'interface' from source: play vars 22286 1726882818.63551: variable 'current_interfaces' from source: set_fact 22286 1726882818.63555: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 22286 1726882818.63557: when evaluation is False, skipping this task 22286 1726882818.63588: variable 'item' from source: unknown 22286 1726882818.63672: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 22286 1726882818.63877: dumping result to json 22286 1726882818.63881: done dumping result, returning 22286 1726882818.63884: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 
[0affe814-3a2d-a75d-4836-0000000005d0] 22286 1726882818.63887: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d0 22286 1726882818.63937: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d0 22286 1726882818.63940: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 22286 1726882818.64033: no more pending results, returning what we have 22286 1726882818.64039: results queue empty 22286 1726882818.64041: checking for any_errors_fatal 22286 1726882818.64055: done checking for any_errors_fatal 22286 1726882818.64056: checking for max_fail_percentage 22286 1726882818.64059: done checking for max_fail_percentage 22286 1726882818.64060: checking to see if all hosts have failed and the running result is not ok 22286 1726882818.64061: done checking to see if all hosts have failed 22286 1726882818.64062: getting the remaining hosts for this loop 22286 1726882818.64065: done getting the remaining hosts for this loop 22286 1726882818.64070: getting the next task for host managed_node3 22286 1726882818.64077: done getting next task for host managed_node3 22286 1726882818.64081: ^ task is: TASK: Set up veth as managed by NetworkManager 22286 1726882818.64087: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882818.64094: getting variables 22286 1726882818.64095: in VariableManager get_vars() 22286 1726882818.64167: Calling all_inventory to load vars for managed_node3 22286 1726882818.64170: Calling groups_inventory to load vars for managed_node3 22286 1726882818.64173: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882818.64186: Calling all_plugins_play to load vars for managed_node3 22286 1726882818.64189: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882818.64193: Calling groups_plugins_play to load vars for managed_node3 22286 1726882818.66491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882818.69056: done with get_vars() 22286 1726882818.69097: done getting variables 22286 1726882818.69169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:40:18 -0400 (0:00:00.123) 0:00:42.085 ****** 22286 1726882818.69212: entering _queue_task() for managed_node3/command 22286 1726882818.69587: worker is 1 (out of 1 available) 22286 1726882818.69601: exiting _queue_task() for managed_node3/command 22286 1726882818.69615: done queuing things up, now waiting for results queue to drain 22286 1726882818.69616: waiting for pending results... 
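
The three skipped items above (`ip link add veth0 ...`, `ip link set peerveth0 up`, `ip link set veth0 up`) imply a looped command task. A sketch of what that task in manage_test_interface.yml likely looks like, reconstructed from the item strings and the `false_condition` in the skip output; the loop keyword and exact templating are assumptions:

```yaml
# Sketch; item strings and the "when" expression are taken verbatim from
# the skip output, with "veth0"/"peerveth0" generalized to the templated
# interface name. Loop style is an assumption (the log shows the 'items'
# lookup plugin being loaded).
- name: Create veth interface {{ interface }}
  ansible.builtin.command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

Here `state` is `absent` (and `veth0` already exists), so every item's conditional evaluated to `False` and the whole task was skipped with `MSG: All items skipped`.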
22286 1726882818.70086: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 22286 1726882818.70093: in run() - task 0affe814-3a2d-a75d-4836-0000000005d1 22286 1726882818.70097: variable 'ansible_search_path' from source: unknown 22286 1726882818.70100: variable 'ansible_search_path' from source: unknown 22286 1726882818.70152: calling self._execute() 22286 1726882818.70272: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.70286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.70296: variable 'omit' from source: magic vars 22286 1726882818.70771: variable 'ansible_distribution_major_version' from source: facts 22286 1726882818.70787: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882818.71016: variable 'type' from source: play vars 22286 1726882818.71052: variable 'state' from source: include params 22286 1726882818.71056: Evaluated conditional (type == 'veth' and state == 'present'): False 22286 1726882818.71059: when evaluation is False, skipping this task 22286 1726882818.71062: _execute() done 22286 1726882818.71064: dumping result to json 22286 1726882818.71067: done dumping result, returning 22286 1726882818.71069: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0affe814-3a2d-a75d-4836-0000000005d1] 22286 1726882818.71071: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d1 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 22286 1726882818.71264: no more pending results, returning what we have 22286 1726882818.71269: results queue empty 22286 1726882818.71270: checking for any_errors_fatal 22286 1726882818.71291: done checking for any_errors_fatal 22286 1726882818.71293: checking for max_fail_percentage 22286 1726882818.71296: done checking for 
max_fail_percentage 22286 1726882818.71298: checking to see if all hosts have failed and the running result is not ok 22286 1726882818.71299: done checking to see if all hosts have failed 22286 1726882818.71300: getting the remaining hosts for this loop 22286 1726882818.71307: done getting the remaining hosts for this loop 22286 1726882818.71312: getting the next task for host managed_node3 22286 1726882818.71321: done getting next task for host managed_node3 22286 1726882818.71324: ^ task is: TASK: Delete veth interface {{ interface }} 22286 1726882818.71328: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882818.71333: getting variables 22286 1726882818.71337: in VariableManager get_vars() 22286 1726882818.71388: Calling all_inventory to load vars for managed_node3 22286 1726882818.71391: Calling groups_inventory to load vars for managed_node3 22286 1726882818.71399: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882818.71628: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d1 22286 1726882818.71633: WORKER PROCESS EXITING 22286 1726882818.71645: Calling all_plugins_play to load vars for managed_node3 22286 1726882818.71649: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882818.71653: Calling groups_plugins_play to load vars for managed_node3 22286 1726882818.73840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882818.76691: done with get_vars() 22286 1726882818.76727: done getting variables 22286 1726882818.76798: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882818.76928: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:40:18 -0400 (0:00:00.077) 0:00:42.163 ****** 22286 1726882818.76966: entering _queue_task() for managed_node3/command 22286 1726882818.77306: worker is 1 (out of 1 available) 22286 1726882818.77433: exiting _queue_task() for managed_node3/command 22286 1726882818.77447: done queuing things up, now waiting for results queue to drain 22286 1726882818.77448: waiting for pending results... 
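
For reference, the "Delete veth interface" task being queued here likely has this shape. The `when` expression is copied from the conditional the log evaluates to `True` just below; the actual `ip link del` command line is an assumption, since the command text is not visible in this portion of the transcript:

```yaml
# Sketch; the "when" expression is taken from the log, the command
# itself is hypothetical.
- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }}
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
```

This is the branch that actually runs in this play: the conditional passes, so the executor proceeds to open the SSH connection and create a remote temp directory for the module payload.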
22286 1726882818.77668: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 22286 1726882818.77869: in run() - task 0affe814-3a2d-a75d-4836-0000000005d2 22286 1726882818.77873: variable 'ansible_search_path' from source: unknown 22286 1726882818.77876: variable 'ansible_search_path' from source: unknown 22286 1726882818.77879: calling self._execute() 22286 1726882818.77996: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.78010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.78026: variable 'omit' from source: magic vars 22286 1726882818.78489: variable 'ansible_distribution_major_version' from source: facts 22286 1726882818.78510: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882818.78803: variable 'type' from source: play vars 22286 1726882818.78841: variable 'state' from source: include params 22286 1726882818.78845: variable 'interface' from source: play vars 22286 1726882818.78848: variable 'current_interfaces' from source: set_fact 22286 1726882818.78854: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 22286 1726882818.78919: variable 'omit' from source: magic vars 22286 1726882818.78923: variable 'omit' from source: magic vars 22286 1726882818.79061: variable 'interface' from source: play vars 22286 1726882818.79087: variable 'omit' from source: magic vars 22286 1726882818.79141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882818.79199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882818.79228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882818.79277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 22286 1726882818.79285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882818.79325: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882818.79372: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.79375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.79506: Set connection var ansible_shell_executable to /bin/sh 22286 1726882818.79524: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882818.79532: Set connection var ansible_connection to ssh 22286 1726882818.79541: Set connection var ansible_shell_type to sh 22286 1726882818.79553: Set connection var ansible_timeout to 10 22286 1726882818.79566: Set connection var ansible_pipelining to False 22286 1726882818.79712: variable 'ansible_shell_executable' from source: unknown 22286 1726882818.79715: variable 'ansible_connection' from source: unknown 22286 1726882818.79718: variable 'ansible_module_compression' from source: unknown 22286 1726882818.79720: variable 'ansible_shell_type' from source: unknown 22286 1726882818.79722: variable 'ansible_shell_executable' from source: unknown 22286 1726882818.79724: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882818.79725: variable 'ansible_pipelining' from source: unknown 22286 1726882818.79727: variable 'ansible_timeout' from source: unknown 22286 1726882818.79729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882818.79827: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882818.79850: 
variable 'omit' from source: magic vars 22286 1726882818.79861: starting attempt loop 22286 1726882818.79867: running the handler 22286 1726882818.79888: _low_level_execute_command(): starting 22286 1726882818.79899: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882818.80756: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882818.80832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882818.80860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882818.80876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882818.81039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882818.82912: stdout chunk (state=3): >>>/root <<< 22286 1726882818.83051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882818.83121: stderr chunk (state=3): >>><<< 22286 1726882818.83143: stdout chunk (state=3): >>><<< 22286 1726882818.83170: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882818.83280: _low_level_execute_command(): starting 22286 1726882818.83285: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639 `" && echo ansible-tmp-1726882818.831778-23831-1641951099639="` echo /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639 `" ) && sleep 0' 22286 1726882818.83861: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882818.83890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882818.83907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882818.83935: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882818.83955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882818.83998: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882818.84050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882818.84111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882818.84130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882818.84156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882818.84314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882818.86517: stdout chunk (state=3): >>>ansible-tmp-1726882818.831778-23831-1641951099639=/root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639 <<< 22286 1726882818.86723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882818.86726: stdout chunk (state=3): >>><<< 22286 1726882818.86728: stderr chunk (state=3): >>><<< 22286 1726882818.86940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882818.831778-23831-1641951099639=/root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882818.86944: variable 'ansible_module_compression' from source: unknown 22286 1726882818.86947: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882818.86949: variable 'ansible_facts' from source: unknown 22286 1726882818.86967: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py 22286 1726882818.87205: Sending initial data 22286 1726882818.87208: Sent initial data (153 bytes) 22286 1726882818.87809: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882818.87940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882818.87964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882818.87992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882818.88139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882818.89975: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882818.90107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882818.90233: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp1qv2z_h2 /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py <<< 22286 1726882818.90238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py" <<< 22286 1726882818.90357: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp1qv2z_h2" to remote "/root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py" <<< 22286 1726882818.91807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882818.91840: stderr chunk (state=3): >>><<< 22286 1726882818.91965: stdout chunk (state=3): >>><<< 22286 1726882818.91969: done transferring module to remote 22286 1726882818.91971: _low_level_execute_command(): starting 22286 1726882818.91974: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/ /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py && sleep 0' 22286 1726882818.92563: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882818.92579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882818.92596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882818.92628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882818.92651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 
1726882818.92748: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882818.92779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882818.92796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882818.92817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882818.92978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882818.95117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882818.95127: stdout chunk (state=3): >>><<< 22286 1726882818.95163: stderr chunk (state=3): >>><<< 22286 1726882818.95369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882818.95372: _low_level_execute_command(): starting 22286 1726882818.95375: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/AnsiballZ_command.py && sleep 0' 22286 1726882818.96551: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882818.96754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882818.96783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882818.96946: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.16670: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:40:19.145633", "end": "2024-09-20 21:40:19.163897", "delta": "0:00:00.018264", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882819.18607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882819.18612: stdout chunk (state=3): >>><<< 22286 1726882819.18618: stderr chunk (state=3): >>><<< 22286 1726882819.18639: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:40:19.145633", "end": "2024-09-20 21:40:19.163897", "delta": "0:00:00.018264", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882819.18678: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882819.18686: _low_level_execute_command(): starting 22286 1726882819.18691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882818.831778-23831-1641951099639/ > /dev/null 2>&1 && sleep 0' 22286 1726882819.19355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882819.19360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.19363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882819.19412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.19451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882819.19462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.19597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.21655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882819.21711: stderr chunk (state=3): >>><<< 22286 1726882819.21714: stdout chunk (state=3): >>><<< 22286 1726882819.21723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882819.21749: handler run complete 22286 1726882819.21808: Evaluated conditional (False): False 22286 1726882819.21811: attempt loop complete, returning result 22286 1726882819.21814: _execute() done 22286 1726882819.21817: dumping result to json 22286 1726882819.21825: done dumping result, returning 22286 1726882819.21835: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0affe814-3a2d-a75d-4836-0000000005d2] 22286 1726882819.21842: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d2 22286 1726882819.21954: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d2 22286 1726882819.21957: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.018264", "end": "2024-09-20 21:40:19.163897", "rc": 0, "start": "2024-09-20 21:40:19.145633" } 22286 1726882819.22086: no more pending results, returning what we have 22286 1726882819.22094: results queue empty 22286 1726882819.22096: checking for any_errors_fatal 22286 1726882819.22102: done checking for any_errors_fatal 22286 1726882819.22103: checking for max_fail_percentage 22286 1726882819.22105: done checking for max_fail_percentage 22286 1726882819.22106: checking to see if all hosts have failed and the running result is not ok 22286 
1726882819.22107: done checking to see if all hosts have failed 22286 1726882819.22108: getting the remaining hosts for this loop 22286 1726882819.22110: done getting the remaining hosts for this loop 22286 1726882819.22114: getting the next task for host managed_node3 22286 1726882819.22121: done getting next task for host managed_node3 22286 1726882819.22124: ^ task is: TASK: Create dummy interface {{ interface }} 22286 1726882819.22127: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882819.22131: getting variables 22286 1726882819.22133: in VariableManager get_vars() 22286 1726882819.22218: Calling all_inventory to load vars for managed_node3 22286 1726882819.22221: Calling groups_inventory to load vars for managed_node3 22286 1726882819.22224: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882819.22238: Calling all_plugins_play to load vars for managed_node3 22286 1726882819.22241: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882819.22245: Calling groups_plugins_play to load vars for managed_node3 22286 1726882819.23856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882819.25433: done with get_vars() 22286 1726882819.25460: done getting variables 22286 1726882819.25510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882819.25610: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:40:19 -0400 (0:00:00.486) 0:00:42.650 ****** 22286 1726882819.25673: entering _queue_task() for managed_node3/command 22286 1726882819.26104: worker is 1 (out of 1 available) 22286 1726882819.26121: exiting _queue_task() for managed_node3/command 22286 1726882819.26146: done queuing things up, now waiting for results queue to drain 22286 1726882819.26148: waiting for pending results... 
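The log repeatedly records lines such as `Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False`, followed by `when evaluation is False, skipping this task`. A minimal Python sketch of that skip decision, using the variable names taken from the log (the helper itself is hypothetical; Ansible actually templates the `when` expression through Jinja2):

```python
# Hypothetical re-creation of the when-condition seen in the log;
# Ansible evaluates the real expression via Jinja2 templating.
def should_create_dummy(type_, state, interface, current_interfaces):
    """Mirror of: type == 'dummy' and state == 'present'
    and interface not in current_interfaces"""
    return (
        type_ == "dummy"
        and state == "present"
        and interface not in current_interfaces
    )

# Values implied by this run: the interface under test is veth0 and the
# play is tearing it down, so the create-dummy task is skipped.
result = should_create_dummy("veth", "absent", "veth0", ["eth0"])
print(result)  # False -> task skipped, matching the log
```

When the condition is false the executor short-circuits before any connection work, which is why the skipped tasks above produce no SSH debug output.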
22286 1726882819.26482: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 22286 1726882819.26569: in run() - task 0affe814-3a2d-a75d-4836-0000000005d3 22286 1726882819.26615: variable 'ansible_search_path' from source: unknown 22286 1726882819.26629: variable 'ansible_search_path' from source: unknown 22286 1726882819.26636: calling self._execute() 22286 1726882819.26763: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.26768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.26781: variable 'omit' from source: magic vars 22286 1726882819.27208: variable 'ansible_distribution_major_version' from source: facts 22286 1726882819.27218: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882819.27469: variable 'type' from source: play vars 22286 1726882819.27497: variable 'state' from source: include params 22286 1726882819.27501: variable 'interface' from source: play vars 22286 1726882819.27504: variable 'current_interfaces' from source: set_fact 22286 1726882819.27507: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 22286 1726882819.27510: when evaluation is False, skipping this task 22286 1726882819.27513: _execute() done 22286 1726882819.27515: dumping result to json 22286 1726882819.27518: done dumping result, returning 22286 1726882819.27520: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0affe814-3a2d-a75d-4836-0000000005d3] 22286 1726882819.27537: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d3 22286 1726882819.27620: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d3 22286 1726882819.27623: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result 
was False" } 22286 1726882819.27693: no more pending results, returning what we have 22286 1726882819.27701: results queue empty 22286 1726882819.27706: checking for any_errors_fatal 22286 1726882819.27714: done checking for any_errors_fatal 22286 1726882819.27715: checking for max_fail_percentage 22286 1726882819.27717: done checking for max_fail_percentage 22286 1726882819.27718: checking to see if all hosts have failed and the running result is not ok 22286 1726882819.27719: done checking to see if all hosts have failed 22286 1726882819.27720: getting the remaining hosts for this loop 22286 1726882819.27721: done getting the remaining hosts for this loop 22286 1726882819.27725: getting the next task for host managed_node3 22286 1726882819.27731: done getting next task for host managed_node3 22286 1726882819.27742: ^ task is: TASK: Delete dummy interface {{ interface }} 22286 1726882819.27746: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882819.27750: getting variables 22286 1726882819.27752: in VariableManager get_vars() 22286 1726882819.27798: Calling all_inventory to load vars for managed_node3 22286 1726882819.27801: Calling groups_inventory to load vars for managed_node3 22286 1726882819.27806: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882819.27866: Calling all_plugins_play to load vars for managed_node3 22286 1726882819.27870: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882819.27877: Calling groups_plugins_play to load vars for managed_node3 22286 1726882819.29344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882819.31258: done with get_vars() 22286 1726882819.31282: done getting variables 22286 1726882819.31336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882819.31423: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:40:19 -0400 (0:00:00.057) 0:00:42.707 ****** 22286 1726882819.31450: entering _queue_task() for managed_node3/command 22286 1726882819.31699: worker is 1 (out of 1 available) 22286 1726882819.31713: exiting _queue_task() for managed_node3/command 22286 1726882819.31725: done queuing things up, now waiting for results queue to drain 22286 1726882819.31727: waiting for pending results... 
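Earlier in the log, the connection plugin builds a remote working directory with a single shell one-liner (`( umask 77 && mkdir -p "~/.ansible/tmp" && mkdir "...ansible-tmp-1726882818.831778-23831-1641951099639" && echo ... ) && sleep 0`). A hedged sketch of how such a command string could be assembled; the naming format mirrors the log's `ansible-tmp-<timestamp>-<pid>-<random>` pattern, but the function itself is illustrative, not Ansible's actual implementation:

```python
import os
import random
import time

def build_tmpdir_command(base="~/.ansible/tmp"):
    """Sketch of the mkdir one-liner seen in the log:
    ( umask 77 && mkdir -p "<base>" && mkdir "<base>/ansible-tmp-..."
      && echo <name>="<path>" ) && sleep 0
    Directory naming follows ansible-tmp-<timestamp>-<pid>-<random>."""
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2**32))
    path = "%s/%s" % (base, name)
    # umask 77 keeps the directory private to the remote user; the echo
    # lets the controller read back the expanded path from stdout.
    cmd = (
        "/bin/sh -c '( umask 77 && mkdir -p \"%s\" && mkdir \"%s\" "
        "&& echo %s=\"%s\" ) && sleep 0'" % (base, path, name, path)
    )
    return name, cmd
```

The trailing `echo <name>=<path>` is what produces the `ansible-tmp-...=/root/.ansible/tmp/ansible-tmp-...` stdout chunk captured above.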
22286 1726882819.31922: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 22286 1726882819.32009: in run() - task 0affe814-3a2d-a75d-4836-0000000005d4 22286 1726882819.32023: variable 'ansible_search_path' from source: unknown 22286 1726882819.32026: variable 'ansible_search_path' from source: unknown 22286 1726882819.32061: calling self._execute() 22286 1726882819.32226: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.32231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.32237: variable 'omit' from source: magic vars 22286 1726882819.32999: variable 'ansible_distribution_major_version' from source: facts 22286 1726882819.33004: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882819.33295: variable 'type' from source: play vars 22286 1726882819.33299: variable 'state' from source: include params 22286 1726882819.33302: variable 'interface' from source: play vars 22286 1726882819.33305: variable 'current_interfaces' from source: set_fact 22286 1726882819.33310: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 22286 1726882819.33312: when evaluation is False, skipping this task 22286 1726882819.33347: _execute() done 22286 1726882819.33351: dumping result to json 22286 1726882819.33354: done dumping result, returning 22286 1726882819.33357: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0affe814-3a2d-a75d-4836-0000000005d4] 22286 1726882819.33399: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d4 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22286 1726882819.33615: no more pending results, returning what we have 22286 1726882819.33620: results queue empty 22286 
1726882819.33621: checking for any_errors_fatal 22286 1726882819.33626: done checking for any_errors_fatal 22286 1726882819.33627: checking for max_fail_percentage 22286 1726882819.33629: done checking for max_fail_percentage 22286 1726882819.33632: checking to see if all hosts have failed and the running result is not ok 22286 1726882819.33635: done checking to see if all hosts have failed 22286 1726882819.33636: getting the remaining hosts for this loop 22286 1726882819.33638: done getting the remaining hosts for this loop 22286 1726882819.33643: getting the next task for host managed_node3 22286 1726882819.33650: done getting next task for host managed_node3 22286 1726882819.33653: ^ task is: TASK: Create tap interface {{ interface }} 22286 1726882819.33662: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882819.33668: getting variables 22286 1726882819.33670: in VariableManager get_vars() 22286 1726882819.33728: Calling all_inventory to load vars for managed_node3 22286 1726882819.33732: Calling groups_inventory to load vars for managed_node3 22286 1726882819.33737: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882819.33749: Calling all_plugins_play to load vars for managed_node3 22286 1726882819.33752: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882819.33756: Calling groups_plugins_play to load vars for managed_node3 22286 1726882819.34358: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d4 22286 1726882819.34362: WORKER PROCESS EXITING 22286 1726882819.35607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882819.39726: done with get_vars() 22286 1726882819.39763: done getting variables 22286 1726882819.39820: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882819.39919: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:40:19 -0400 (0:00:00.085) 0:00:42.793 ****** 22286 1726882819.39963: entering _queue_task() for managed_node3/command 22286 1726882819.40346: worker is 1 (out of 1 available) 22286 1726882819.40363: exiting _queue_task() for managed_node3/command 22286 1726882819.40378: done queuing things up, now waiting for results queue to drain 22286 1726882819.40379: waiting for pending results... 
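
The "Evaluated conditional ... False / skipping this task" entries above and below come from tasks guarded by `when:` clauses combining `type`, `state`, and `current_interfaces`. The actual task file is at `playbooks/tasks/manage_test_interface.yml` (line 60 per the task path above); this is only a hypothetical sketch of the shape such a guarded task takes, not the file's real contents:

```yaml
# Hypothetical sketch of a conditionally-skipped interface task;
# the real definitions live in manage_test_interface.yml.
- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap
  when:
    - type == 'tap'
    - state == 'present'
    - interface not in current_interfaces
```

When any clause renders False, Ansible records `false_condition` and `skip_reason` in the result, exactly as the `skipping: [managed_node3]` JSON blocks in this log show.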
22286 1726882819.40861: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 22286 1726882819.40900: in run() - task 0affe814-3a2d-a75d-4836-0000000005d5 22286 1726882819.40929: variable 'ansible_search_path' from source: unknown 22286 1726882819.40941: variable 'ansible_search_path' from source: unknown 22286 1726882819.40991: calling self._execute() 22286 1726882819.41251: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.41255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.41259: variable 'omit' from source: magic vars 22286 1726882819.41962: variable 'ansible_distribution_major_version' from source: facts 22286 1726882819.41989: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882819.42264: variable 'type' from source: play vars 22286 1726882819.42268: variable 'state' from source: include params 22286 1726882819.42274: variable 'interface' from source: play vars 22286 1726882819.42282: variable 'current_interfaces' from source: set_fact 22286 1726882819.42294: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 22286 1726882819.42300: when evaluation is False, skipping this task 22286 1726882819.42303: _execute() done 22286 1726882819.42306: dumping result to json 22286 1726882819.42308: done dumping result, returning 22286 1726882819.42311: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0affe814-3a2d-a75d-4836-0000000005d5] 22286 1726882819.42320: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d5 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22286 1726882819.42482: no more pending results, returning what we have 22286 1726882819.42488: results queue empty 22286 
1726882819.42489: checking for any_errors_fatal 22286 1726882819.42495: done checking for any_errors_fatal 22286 1726882819.42495: checking for max_fail_percentage 22286 1726882819.42498: done checking for max_fail_percentage 22286 1726882819.42499: checking to see if all hosts have failed and the running result is not ok 22286 1726882819.42505: done checking to see if all hosts have failed 22286 1726882819.42506: getting the remaining hosts for this loop 22286 1726882819.42508: done getting the remaining hosts for this loop 22286 1726882819.42516: getting the next task for host managed_node3 22286 1726882819.42524: done getting next task for host managed_node3 22286 1726882819.42527: ^ task is: TASK: Delete tap interface {{ interface }} 22286 1726882819.42533: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882819.42539: getting variables 22286 1726882819.42540: in VariableManager get_vars() 22286 1726882819.42590: Calling all_inventory to load vars for managed_node3 22286 1726882819.42593: Calling groups_inventory to load vars for managed_node3 22286 1726882819.42595: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882819.42611: Calling all_plugins_play to load vars for managed_node3 22286 1726882819.42615: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882819.42619: Calling groups_plugins_play to load vars for managed_node3 22286 1726882819.42639: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d5 22286 1726882819.42643: WORKER PROCESS EXITING 22286 1726882819.44819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882819.47012: done with get_vars() 22286 1726882819.47039: done getting variables 22286 1726882819.47090: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22286 1726882819.47237: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:40:19 -0400 (0:00:00.072) 0:00:42.866 ****** 22286 1726882819.47263: entering _queue_task() for managed_node3/command 22286 1726882819.47946: worker is 1 (out of 1 available) 22286 1726882819.47956: exiting _queue_task() for managed_node3/command 22286 1726882819.47976: done queuing things up, now waiting for results queue to drain 22286 1726882819.47978: waiting for pending results... 
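
Each skip in this run follows the same pattern: the `when:` expression is rendered with the task's variables and the task body only runs if it evaluates True. A minimal Python stand-in for that boolean logic (illustrative only; Ansible actually renders these expressions through its Jinja2 templar):

```python
# Stand-in for the create/delete conditionals evaluated in the log above.
# Function and parameter names here are illustrative, not Ansible API.
def evaluate_when(type_, state, interface, current_interfaces):
    create = (type_ == "tap" and state == "present"
              and interface not in current_interfaces)
    delete = (type_ == "tap" and state == "absent"
              and interface in current_interfaces)
    return create, delete

# A non-tap interface type makes both conditionals False, so both the
# create and delete tap tasks are skipped, as seen in this run.
print(evaluate_when("veth", "absent", "veth0", ["eth0", "veth0"]))
```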
22286 1726882819.48250: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 22286 1726882819.48256: in run() - task 0affe814-3a2d-a75d-4836-0000000005d6 22286 1726882819.48262: variable 'ansible_search_path' from source: unknown 22286 1726882819.48265: variable 'ansible_search_path' from source: unknown 22286 1726882819.48320: calling self._execute() 22286 1726882819.48451: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.48456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.48468: variable 'omit' from source: magic vars 22286 1726882819.48997: variable 'ansible_distribution_major_version' from source: facts 22286 1726882819.49002: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882819.49465: variable 'type' from source: play vars 22286 1726882819.49469: variable 'state' from source: include params 22286 1726882819.49472: variable 'interface' from source: play vars 22286 1726882819.49474: variable 'current_interfaces' from source: set_fact 22286 1726882819.49479: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 22286 1726882819.49481: when evaluation is False, skipping this task 22286 1726882819.49487: _execute() done 22286 1726882819.49490: dumping result to json 22286 1726882819.49564: done dumping result, returning 22286 1726882819.49604: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0affe814-3a2d-a75d-4836-0000000005d6] 22286 1726882819.49632: sending task result for task 0affe814-3a2d-a75d-4836-0000000005d6 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22286 1726882819.49947: no more pending results, returning what we have 22286 1726882819.49952: results queue empty 22286 1726882819.49954: 
checking for any_errors_fatal 22286 1726882819.49959: done checking for any_errors_fatal 22286 1726882819.49960: checking for max_fail_percentage 22286 1726882819.49963: done checking for max_fail_percentage 22286 1726882819.49964: checking to see if all hosts have failed and the running result is not ok 22286 1726882819.49966: done checking to see if all hosts have failed 22286 1726882819.49966: getting the remaining hosts for this loop 22286 1726882819.49968: done getting the remaining hosts for this loop 22286 1726882819.49973: getting the next task for host managed_node3 22286 1726882819.49983: done getting next task for host managed_node3 22286 1726882819.49987: ^ task is: TASK: Clean up namespace 22286 1726882819.49991: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882819.49996: getting variables 22286 1726882819.49998: in VariableManager get_vars() 22286 1726882819.50092: Calling all_inventory to load vars for managed_node3 22286 1726882819.50100: Calling groups_inventory to load vars for managed_node3 22286 1726882819.50107: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882819.50121: Calling all_plugins_play to load vars for managed_node3 22286 1726882819.50125: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882819.50129: Calling groups_plugins_play to load vars for managed_node3 22286 1726882819.50771: done sending task result for task 0affe814-3a2d-a75d-4836-0000000005d6 22286 1726882819.50775: WORKER PROCESS EXITING 22286 1726882819.51697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882819.53936: done with get_vars() 22286 1726882819.53959: done getting variables 22286 1726882819.54012: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Friday 20 September 2024 21:40:19 -0400 (0:00:00.067) 0:00:42.933 ****** 22286 1726882819.54037: entering _queue_task() for managed_node3/command 22286 1726882819.54291: worker is 1 (out of 1 available) 22286 1726882819.54307: exiting _queue_task() for managed_node3/command 22286 1726882819.54321: done queuing things up, now waiting for results queue to drain 22286 1726882819.54323: waiting for pending results... 
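
The `_low_level_execute_command()` lines that follow show how the ssh connection plugin wraps every remote step in a `/bin/sh -c '... && sleep 0'` invocation (home-directory probe, tmpdir creation, chmod, module execution, cleanup). A simplified sketch of that wrapping (illustrative; the real logic lives in Ansible's shell and ssh plugins):

```python
# Sketch of the command wrapping visible in the log's
# "_low_level_execute_command(): executing: /bin/sh -c '...'" lines.
# The trailing "&& sleep 0" nudges the remote shell to flush output
# before the channel closes.
def wrap_low_level(cmd: str) -> list[str]:
    # Illustrative helper, not an actual Ansible function.
    return ["/bin/sh", "-c", f"{cmd} && sleep 0"]

print(wrap_low_level("echo ~"))
```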
22286 1726882819.54503: running TaskExecutor() for managed_node3/TASK: Clean up namespace 22286 1726882819.54580: in run() - task 0affe814-3a2d-a75d-4836-0000000000b4 22286 1726882819.54593: variable 'ansible_search_path' from source: unknown 22286 1726882819.54624: calling self._execute() 22286 1726882819.54710: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.54717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.54727: variable 'omit' from source: magic vars 22286 1726882819.55039: variable 'ansible_distribution_major_version' from source: facts 22286 1726882819.55051: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882819.55058: variable 'omit' from source: magic vars 22286 1726882819.55078: variable 'omit' from source: magic vars 22286 1726882819.55108: variable 'omit' from source: magic vars 22286 1726882819.55146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882819.55179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882819.55197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882819.55216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882819.55230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882819.55259: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882819.55263: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.55269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.55361: Set connection var ansible_shell_executable to /bin/sh 22286 1726882819.55369: 
Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882819.55372: Set connection var ansible_connection to ssh 22286 1726882819.55374: Set connection var ansible_shell_type to sh 22286 1726882819.55383: Set connection var ansible_timeout to 10 22286 1726882819.55391: Set connection var ansible_pipelining to False 22286 1726882819.55411: variable 'ansible_shell_executable' from source: unknown 22286 1726882819.55414: variable 'ansible_connection' from source: unknown 22286 1726882819.55418: variable 'ansible_module_compression' from source: unknown 22286 1726882819.55422: variable 'ansible_shell_type' from source: unknown 22286 1726882819.55424: variable 'ansible_shell_executable' from source: unknown 22286 1726882819.55435: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882819.55438: variable 'ansible_pipelining' from source: unknown 22286 1726882819.55442: variable 'ansible_timeout' from source: unknown 22286 1726882819.55445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882819.55564: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882819.55574: variable 'omit' from source: magic vars 22286 1726882819.55580: starting attempt loop 22286 1726882819.55583: running the handler 22286 1726882819.55598: _low_level_execute_command(): starting 22286 1726882819.55606: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882819.56152: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882819.56155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882819.56159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882819.56162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.56223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882819.56227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882819.56230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.56357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.58194: stdout chunk (state=3): >>>/root <<< 22286 1726882819.58308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882819.58360: stderr chunk (state=3): >>><<< 22286 1726882819.58365: stdout chunk (state=3): >>><<< 22286 1726882819.58388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882819.58401: _low_level_execute_command(): starting 22286 1726882819.58408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504 `" && echo ansible-tmp-1726882819.5838883-23876-230989224189504="` echo /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504 `" ) && sleep 0' 22286 1726882819.58827: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882819.58859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882819.58863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.58866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.58924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882819.58930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.59052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.61172: stdout chunk (state=3): >>>ansible-tmp-1726882819.5838883-23876-230989224189504=/root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504 <<< 22286 1726882819.61294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882819.61342: stderr chunk (state=3): >>><<< 22286 1726882819.61345: stdout chunk (state=3): >>><<< 22286 1726882819.61360: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882819.5838883-23876-230989224189504=/root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882819.61388: variable 'ansible_module_compression' from source: unknown 22286 1726882819.61432: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882819.61462: variable 'ansible_facts' from source: unknown 22286 1726882819.61529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py 22286 1726882819.61638: Sending initial data 22286 1726882819.61642: Sent initial data (156 bytes) 22286 1726882819.62087: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882819.62091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882819.62093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration <<< 22286 1726882819.62097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.62155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882819.62160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.62273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.63990: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 22286 1726882819.63996: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882819.64104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882819.64220: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp79113ub2 /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py <<< 22286 1726882819.64224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py" <<< 22286 1726882819.64331: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp79113ub2" to remote "/root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py" <<< 22286 1726882819.65407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882819.65466: stderr chunk (state=3): >>><<< 22286 1726882819.65470: stdout chunk (state=3): >>><<< 22286 1726882819.65491: done transferring module to remote 22286 1726882819.65505: _low_level_execute_command(): starting 22286 1726882819.65508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/ /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py && sleep 0' 22286 1726882819.65928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882819.65964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882819.65968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 
1726882819.65970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882819.65973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.66025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882819.66032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.66149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.68090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882819.68138: stderr chunk (state=3): >>><<< 22286 1726882819.68144: stdout chunk (state=3): >>><<< 22286 1726882819.68156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882819.68160: _low_level_execute_command(): starting 22286 1726882819.68165: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/AnsiballZ_command.py && sleep 0' 22286 1726882819.68611: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882819.68615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.68618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882819.68620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found <<< 22286 1726882819.68623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.68671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882819.68674: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.68798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.86790: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:40:19.860069", "end": "2024-09-20 21:40:19.865061", "delta": "0:00:00.004992", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882819.88497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. <<< 22286 1726882819.88739: stderr chunk (state=3): >>><<< 22286 1726882819.88743: stdout chunk (state=3): >>><<< 22286 1726882819.88745: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:40:19.860069", "end": "2024-09-20 21:40:19.865061", "delta": "0:00:00.004992", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882819.88749: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882819.88753: _low_level_execute_command(): starting 22286 1726882819.88755: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882819.5838883-23876-230989224189504/ > /dev/null 2>&1 && sleep 0' 22286 1726882819.90020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22286 1726882819.90039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882819.90190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882819.90194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882819.90351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882819.90430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882819.92495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882819.92744: stderr chunk (state=3): >>><<< 22286 1726882819.92747: stdout chunk (state=3): >>><<< 22286 1726882819.92749: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882819.92751: handler run complete 22286 1726882819.92753: Evaluated conditional (False): False 22286 1726882819.92754: attempt loop complete, returning result 22286 1726882819.92756: _execute() done 22286 1726882819.92758: dumping result to json 22286 1726882819.92759: done dumping result, returning 22286 1726882819.92762: done running TaskExecutor() for managed_node3/TASK: Clean up namespace [0affe814-3a2d-a75d-4836-0000000000b4] 22286 1726882819.92764: sending task result for task 0affe814-3a2d-a75d-4836-0000000000b4 22286 1726882819.92838: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000b4 22286 1726882819.92842: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.004992", "end": "2024-09-20 21:40:19.865061", "rc": 0, "start": "2024-09-20 21:40:19.860069" } 22286 1726882819.92925: no more pending results, returning what we have 22286 1726882819.92929: results queue empty 22286 1726882819.92930: checking for any_errors_fatal 22286 1726882819.92935: done checking for any_errors_fatal 22286 1726882819.92936: checking for max_fail_percentage 22286 1726882819.92938: done checking for max_fail_percentage 22286 1726882819.92939: checking to see if all hosts have failed and the running result is not ok 22286 1726882819.92940: done 
checking to see if all hosts have failed 22286 1726882819.92941: getting the remaining hosts for this loop 22286 1726882819.92943: done getting the remaining hosts for this loop 22286 1726882819.92947: getting the next task for host managed_node3 22286 1726882819.92952: done getting next task for host managed_node3 22286 1726882819.92957: ^ task is: TASK: Verify network state restored to default 22286 1726882819.92959: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882819.92962: getting variables 22286 1726882819.92964: in VariableManager get_vars() 22286 1726882819.93005: Calling all_inventory to load vars for managed_node3 22286 1726882819.93008: Calling groups_inventory to load vars for managed_node3 22286 1726882819.93011: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882819.93022: Calling all_plugins_play to load vars for managed_node3 22286 1726882819.93025: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882819.93028: Calling groups_plugins_play to load vars for managed_node3 22286 1726882819.99456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.02403: done with get_vars() 22286 1726882820.02438: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Friday 20 September 2024 21:40:20 -0400 (0:00:00.484) 0:00:43.418 ****** 22286 1726882820.02532: entering _queue_task() for managed_node3/include_tasks 22286 1726882820.02897: worker is 1 (out of 1 available) 22286 1726882820.02911: exiting _queue_task() for 
managed_node3/include_tasks 22286 1726882820.02924: done queuing things up, now waiting for results queue to drain 22286 1726882820.02926: waiting for pending results... 22286 1726882820.03259: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 22286 1726882820.03446: in run() - task 0affe814-3a2d-a75d-4836-0000000000b5 22286 1726882820.03450: variable 'ansible_search_path' from source: unknown 22286 1726882820.03453: calling self._execute() 22286 1726882820.03519: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882820.03528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882820.03543: variable 'omit' from source: magic vars 22286 1726882820.04005: variable 'ansible_distribution_major_version' from source: facts 22286 1726882820.04018: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882820.04024: _execute() done 22286 1726882820.04030: dumping result to json 22286 1726882820.04037: done dumping result, returning 22286 1726882820.04044: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0affe814-3a2d-a75d-4836-0000000000b5] 22286 1726882820.04051: sending task result for task 0affe814-3a2d-a75d-4836-0000000000b5 22286 1726882820.04169: done sending task result for task 0affe814-3a2d-a75d-4836-0000000000b5 22286 1726882820.04172: WORKER PROCESS EXITING 22286 1726882820.04206: no more pending results, returning what we have 22286 1726882820.04213: in VariableManager get_vars() 22286 1726882820.04268: Calling all_inventory to load vars for managed_node3 22286 1726882820.04272: Calling groups_inventory to load vars for managed_node3 22286 1726882820.04277: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882820.04294: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.04298: Calling groups_plugins_inventory to load vars for managed_node3 22286 
1726882820.04301: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.06750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.09618: done with get_vars() 22286 1726882820.09651: variable 'ansible_search_path' from source: unknown 22286 1726882820.09668: we have included files to process 22286 1726882820.09669: generating all_blocks data 22286 1726882820.09671: done generating all_blocks data 22286 1726882820.09681: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22286 1726882820.09682: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22286 1726882820.09685: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22286 1726882820.10201: done processing included file 22286 1726882820.10204: iterating over new_blocks loaded from include file 22286 1726882820.10205: in VariableManager get_vars() 22286 1726882820.10228: done with get_vars() 22286 1726882820.10230: filtering new block on tags 22286 1726882820.10255: done filtering new block on tags 22286 1726882820.10258: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 22286 1726882820.10263: extending task lists for all hosts with included blocks 22286 1726882820.15807: done extending task lists 22286 1726882820.15808: done processing included files 22286 1726882820.15809: results queue empty 22286 1726882820.15810: checking for any_errors_fatal 22286 1726882820.15817: done checking for any_errors_fatal 22286 1726882820.15818: checking for max_fail_percentage 22286 1726882820.16036: done checking for 
max_fail_percentage 22286 1726882820.16038: checking to see if all hosts have failed and the running result is not ok 22286 1726882820.16040: done checking to see if all hosts have failed 22286 1726882820.16041: getting the remaining hosts for this loop 22286 1726882820.16042: done getting the remaining hosts for this loop 22286 1726882820.16046: getting the next task for host managed_node3 22286 1726882820.16051: done getting next task for host managed_node3 22286 1726882820.16053: ^ task is: TASK: Check routes and DNS 22286 1726882820.16056: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882820.16058: getting variables 22286 1726882820.16060: in VariableManager get_vars() 22286 1726882820.16080: Calling all_inventory to load vars for managed_node3 22286 1726882820.16083: Calling groups_inventory to load vars for managed_node3 22286 1726882820.16087: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882820.16094: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.16097: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882820.16101: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.18607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.24073: done with get_vars() 22286 1726882820.24105: done getting variables 22286 1726882820.24168: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:40:20 -0400 (0:00:00.216) 0:00:43.635 ****** 22286 1726882820.24204: entering _queue_task() for managed_node3/shell 22286 1726882820.24668: worker is 1 (out of 1 available) 22286 1726882820.24683: exiting _queue_task() for managed_node3/shell 22286 1726882820.24697: done queuing things up, now waiting for results queue to drain 22286 1726882820.24699: waiting for pending results... 
22286 1726882820.25280: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 22286 1726882820.25614: in run() - task 0affe814-3a2d-a75d-4836-00000000075e 22286 1726882820.25632: variable 'ansible_search_path' from source: unknown 22286 1726882820.25637: variable 'ansible_search_path' from source: unknown 22286 1726882820.25811: calling self._execute() 22286 1726882820.25939: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882820.25950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882820.25962: variable 'omit' from source: magic vars 22286 1726882820.26926: variable 'ansible_distribution_major_version' from source: facts 22286 1726882820.27074: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882820.27085: variable 'omit' from source: magic vars 22286 1726882820.27131: variable 'omit' from source: magic vars 22286 1726882820.27246: variable 'omit' from source: magic vars 22286 1726882820.27297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22286 1726882820.27401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22286 1726882820.27428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22286 1726882820.27453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882820.27467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22286 1726882820.27630: variable 'inventory_hostname' from source: host vars for 'managed_node3' 22286 1726882820.27656: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882820.27664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882820.27884: 
Set connection var ansible_shell_executable to /bin/sh 22286 1726882820.27894: Set connection var ansible_module_compression to ZIP_DEFLATED 22286 1726882820.27898: Set connection var ansible_connection to ssh 22286 1726882820.27900: Set connection var ansible_shell_type to sh 22286 1726882820.27907: Set connection var ansible_timeout to 10 22286 1726882820.27918: Set connection var ansible_pipelining to False 22286 1726882820.28320: variable 'ansible_shell_executable' from source: unknown 22286 1726882820.28323: variable 'ansible_connection' from source: unknown 22286 1726882820.28327: variable 'ansible_module_compression' from source: unknown 22286 1726882820.28329: variable 'ansible_shell_type' from source: unknown 22286 1726882820.28332: variable 'ansible_shell_executable' from source: unknown 22286 1726882820.28336: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882820.28338: variable 'ansible_pipelining' from source: unknown 22286 1726882820.28341: variable 'ansible_timeout' from source: unknown 22286 1726882820.28343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882820.28740: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882820.28753: variable 'omit' from source: magic vars 22286 1726882820.28759: starting attempt loop 22286 1726882820.28763: running the handler 22286 1726882820.28792: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22286 1726882820.28839: 
_low_level_execute_command(): starting 22286 1726882820.28842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22286 1726882820.30376: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882820.30392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882820.30405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882820.30420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882820.30434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882820.30542: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882820.30548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.30550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882820.30552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882820.30583: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882820.30759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882820.30903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882820.32785: stdout chunk (state=3): >>>/root <<< 22286 1726882820.33090: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 22286 1726882820.33096: stdout chunk (state=3): >>><<< 22286 1726882820.33107: stderr chunk (state=3): >>><<< 22286 1726882820.33132: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882820.33148: _low_level_execute_command(): starting 22286 1726882820.33155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643 `" && echo ansible-tmp-1726882820.3313174-23904-64940549953643="` echo /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643 `" ) && sleep 0' 22286 1726882820.34439: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882820.34456: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found <<< 22286 1726882820.34459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.34462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882820.34465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.34700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882820.34942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882820.37017: stdout chunk (state=3): >>>ansible-tmp-1726882820.3313174-23904-64940549953643=/root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643 <<< 22286 1726882820.37207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882820.37210: stdout chunk (state=3): >>><<< 22286 1726882820.37219: stderr chunk (state=3): >>><<< 22286 1726882820.37243: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882820.3313174-23904-64940549953643=/root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882820.37277: variable 'ansible_module_compression' from source: unknown 22286 1726882820.37337: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22286nvkga28q/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22286 1726882820.37375: variable 'ansible_facts' from source: unknown 22286 1726882820.37766: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py 22286 1726882820.37995: Sending initial data 22286 1726882820.37998: Sent initial data (155 bytes) 22286 1726882820.39251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.39308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882820.39559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882820.39573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882820.39723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882820.41514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22286 1726882820.41518: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22286 1726882820.41655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22286 1726882820.41764: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22286nvkga28q/tmp256e66kr /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py <<< 22286 1726882820.41771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py" <<< 22286 1726882820.41904: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22286nvkga28q/tmp256e66kr" to remote "/root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py" <<< 22286 1726882820.44185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882820.44191: stderr chunk (state=3): >>><<< 22286 1726882820.44196: stdout chunk (state=3): >>><<< 22286 1726882820.44240: done transferring module to remote 22286 1726882820.44244: _low_level_execute_command(): starting 22286 1726882820.44247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/ /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py && sleep 0' 22286 1726882820.45449: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.45694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882820.45698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882820.45740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882820.45858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882820.47978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882820.48071: stderr chunk (state=3): >>><<< 22286 1726882820.48360: stdout chunk (state=3): >>><<< 22286 1726882820.48364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882820.48367: _low_level_execute_command(): starting 22286 1726882820.48369: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/AnsiballZ_command.py && sleep 0' 22286 1726882820.49281: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882820.49546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22286 1726882820.49560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882820.49573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882820.49853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882820.68516: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group 
default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0e:39:03:af:ed:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.41.238/22 brd 10.31.43.255 scope global dynamic noprefixroute eth0\n valid_lft 2654sec preferred_lft 2654sec\n inet6 fe80::a0b7:fdc4:48e8:7158/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.40.1 dev eth0 proto dhcp src 10.31.41.238 metric 100 \n10.31.40.0/22 dev eth0 proto kernel scope link src 10.31.41.238 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:40:20.673867", "end": "2024-09-20 21:40:20.682905", "delta": "0:00:00.009038", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22286 1726882820.70245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882820.70254: stderr chunk (state=3): >>>Shared connection to 10.31.41.238 closed. 
<<< 22286 1726882820.70351: stderr chunk (state=3): >>><<< 22286 1726882820.70372: stdout chunk (state=3): >>><<< 22286 1726882820.70403: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0e:39:03:af:ed:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.41.238/22 brd 10.31.43.255 scope global dynamic noprefixroute eth0\n valid_lft 2654sec preferred_lft 2654sec\n inet6 fe80::a0b7:fdc4:48e8:7158/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.40.1 dev eth0 proto dhcp src 10.31.41.238 metric 100 \n10.31.40.0/22 dev eth0 proto kernel scope link src 10.31.41.238 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:40:20.673867", "end": "2024-09-20 21:40:20.682905", "delta": "0:00:00.009038", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.41.238 closed. 22286 1726882820.70469: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22286 1726882820.70481: _low_level_execute_command(): starting 22286 1726882820.70490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882820.3313174-23904-64940549953643/ > /dev/null 2>&1 && sleep 0' 22286 1726882820.71098: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22286 1726882820.71108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882820.71120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882820.71142: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882820.71151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882820.71160: stderr chunk (state=3): >>>debug2: match not found <<< 22286 1726882820.71171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.71190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22286 1726882820.71198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.41.238 is address <<< 22286 1726882820.71206: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22286 1726882820.71216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22286 1726882820.71226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22286 1726882820.71251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22286 1726882820.71255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 <<< 22286 1726882820.71339: stderr chunk (state=3): >>>debug2: match found <<< 22286 1726882820.71343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22286 1726882820.71353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22286 1726882820.71367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22286 1726882820.71389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22286 1726882820.71535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22286 1726882820.73744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22286 1726882820.73870: stdout chunk (state=3): >>><<< 
22286 1726882820.73873: stderr chunk (state=3): >>><<< 22286 1726882820.73876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.41.238 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.41.238 originally 10.31.41.238 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22286 1726882820.73879: handler run complete 22286 1726882820.73917: Evaluated conditional (False): False 22286 1726882820.74001: attempt loop complete, returning result 22286 1726882820.74009: _execute() done 22286 1726882820.74018: dumping result to json 22286 1726882820.74031: done dumping result, returning 22286 1726882820.74140: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affe814-3a2d-a75d-4836-00000000075e] 22286 1726882820.74143: sending task result for task 0affe814-3a2d-a75d-4836-00000000075e 22286 1726882820.74551: done sending task result for task 0affe814-3a2d-a75d-4836-00000000075e 22286 1726882820.74555: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, 
"cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009038", "end": "2024-09-20 21:40:20.682905", "rc": 0, "start": "2024-09-20 21:40:20.673867" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0e:39:03:af:ed:a3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.41.238/22 brd 10.31.43.255 scope global dynamic noprefixroute eth0 valid_lft 2654sec preferred_lft 2654sec inet6 fe80::a0b7:fdc4:48e8:7158/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.40.1 dev eth0 proto dhcp src 10.31.41.238 metric 100 10.31.40.0/22 dev eth0 proto kernel scope link src 10.31.41.238 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. 
# # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 22286 1726882820.74655: no more pending results, returning what we have 22286 1726882820.74658: results queue empty 22286 1726882820.74660: checking for any_errors_fatal 22286 1726882820.74661: done checking for any_errors_fatal 22286 1726882820.74781: checking for max_fail_percentage 22286 1726882820.74785: done checking for max_fail_percentage 22286 1726882820.74786: checking to see if all hosts have failed and the running result is not ok 22286 1726882820.74788: done checking to see if all hosts have failed 22286 1726882820.74789: getting the remaining hosts for this loop 22286 1726882820.74791: done getting the remaining hosts for this loop 22286 1726882820.74796: getting the next task for host managed_node3 22286 1726882820.74803: done getting next task for host managed_node3 22286 1726882820.74806: ^ task is: TASK: Verify DNS and network connectivity 22286 1726882820.74810: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22286 1726882820.74814: getting variables 22286 1726882820.74816: in VariableManager get_vars() 22286 1726882820.74868: Calling all_inventory to load vars for managed_node3 22286 1726882820.74872: Calling groups_inventory to load vars for managed_node3 22286 1726882820.74875: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882820.74888: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.74892: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882820.74896: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.78416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.82448: done with get_vars() 22286 1726882820.82488: done getting variables 22286 1726882820.82565: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:40:20 -0400 (0:00:00.583) 0:00:44.219 ****** 22286 1726882820.82602: entering _queue_task() for managed_node3/shell 22286 1726882820.82978: worker is 1 (out of 1 available) 22286 1726882820.82990: exiting _queue_task() for managed_node3/shell 22286 1726882820.83006: done queuing things up, now waiting for results queue to drain 22286 1726882820.83008: waiting for pending results... 
22286 1726882820.83341: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 22286 1726882820.83483: in run() - task 0affe814-3a2d-a75d-4836-00000000075f 22286 1726882820.83519: variable 'ansible_search_path' from source: unknown 22286 1726882820.83541: variable 'ansible_search_path' from source: unknown 22286 1726882820.83621: calling self._execute() 22286 1726882820.83731: variable 'ansible_host' from source: host vars for 'managed_node3' 22286 1726882820.83750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 22286 1726882820.83768: variable 'omit' from source: magic vars 22286 1726882820.84257: variable 'ansible_distribution_major_version' from source: facts 22286 1726882820.84380: Evaluated conditional (ansible_distribution_major_version != '6'): True 22286 1726882820.84484: variable 'ansible_facts' from source: unknown 22286 1726882820.85728: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 22286 1726882820.85742: when evaluation is False, skipping this task 22286 1726882820.85752: _execute() done 22286 1726882820.85761: dumping result to json 22286 1726882820.85805: done dumping result, returning 22286 1726882820.85813: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affe814-3a2d-a75d-4836-00000000075f] 22286 1726882820.85816: sending task result for task 0affe814-3a2d-a75d-4836-00000000075f skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 22286 1726882820.85969: no more pending results, returning what we have 22286 1726882820.85974: results queue empty 22286 1726882820.85975: checking for any_errors_fatal 22286 1726882820.85988: done checking for any_errors_fatal 22286 1726882820.85989: checking for max_fail_percentage 22286 1726882820.85991: done checking for max_fail_percentage 22286 1726882820.85993: checking to see if 
all hosts have failed and the running result is not ok 22286 1726882820.85994: done checking to see if all hosts have failed 22286 1726882820.85995: getting the remaining hosts for this loop 22286 1726882820.85997: done getting the remaining hosts for this loop 22286 1726882820.86002: getting the next task for host managed_node3 22286 1726882820.86012: done getting next task for host managed_node3 22286 1726882820.86016: ^ task is: TASK: meta (flush_handlers) 22286 1726882820.86019: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882820.86025: getting variables 22286 1726882820.86027: in VariableManager get_vars() 22286 1726882820.86194: Calling all_inventory to load vars for managed_node3 22286 1726882820.86198: Calling groups_inventory to load vars for managed_node3 22286 1726882820.86201: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882820.86218: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.86222: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882820.86226: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.86769: done sending task result for task 0affe814-3a2d-a75d-4836-00000000075f 22286 1726882820.86773: WORKER PROCESS EXITING 22286 1726882820.88858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.91205: done with get_vars() 22286 1726882820.91228: done getting variables 22286 1726882820.91289: in VariableManager get_vars() 22286 1726882820.91302: Calling all_inventory to load vars for managed_node3 22286 1726882820.91305: Calling groups_inventory to load vars for managed_node3 22286 1726882820.91307: Calling 
all_plugins_inventory to load vars for managed_node3 22286 1726882820.91311: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.91313: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882820.91315: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.92453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.94591: done with get_vars() 22286 1726882820.94617: done queuing things up, now waiting for results queue to drain 22286 1726882820.94619: results queue empty 22286 1726882820.94620: checking for any_errors_fatal 22286 1726882820.94622: done checking for any_errors_fatal 22286 1726882820.94623: checking for max_fail_percentage 22286 1726882820.94624: done checking for max_fail_percentage 22286 1726882820.94625: checking to see if all hosts have failed and the running result is not ok 22286 1726882820.94626: done checking to see if all hosts have failed 22286 1726882820.94626: getting the remaining hosts for this loop 22286 1726882820.94627: done getting the remaining hosts for this loop 22286 1726882820.94630: getting the next task for host managed_node3 22286 1726882820.94632: done getting next task for host managed_node3 22286 1726882820.94635: ^ task is: TASK: meta (flush_handlers) 22286 1726882820.94637: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22286 1726882820.94639: getting variables 22286 1726882820.94640: in VariableManager get_vars() 22286 1726882820.94651: Calling all_inventory to load vars for managed_node3 22286 1726882820.94653: Calling groups_inventory to load vars for managed_node3 22286 1726882820.94655: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882820.94659: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.94660: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882820.94663: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.95800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882820.97986: done with get_vars() 22286 1726882820.98006: done getting variables 22286 1726882820.98048: in VariableManager get_vars() 22286 1726882820.98059: Calling all_inventory to load vars for managed_node3 22286 1726882820.98061: Calling groups_inventory to load vars for managed_node3 22286 1726882820.98062: Calling all_plugins_inventory to load vars for managed_node3 22286 1726882820.98066: Calling all_plugins_play to load vars for managed_node3 22286 1726882820.98068: Calling groups_plugins_inventory to load vars for managed_node3 22286 1726882820.98071: Calling groups_plugins_play to load vars for managed_node3 22286 1726882820.99364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22286 1726882821.01784: done with get_vars() 22286 1726882821.01808: done queuing things up, now waiting for results queue to drain 22286 1726882821.01810: results queue empty 22286 1726882821.01810: checking for any_errors_fatal 22286 1726882821.01812: done checking for any_errors_fatal 22286 1726882821.01812: checking for max_fail_percentage 22286 1726882821.01813: done checking for max_fail_percentage 22286 1726882821.01814: checking to see if all hosts have failed and the running result is not 
ok 22286 1726882821.01814: done checking to see if all hosts have failed 22286 1726882821.01815: getting the remaining hosts for this loop 22286 1726882821.01816: done getting the remaining hosts for this loop 22286 1726882821.01822: getting the next task for host managed_node3 22286 1726882821.01825: done getting next task for host managed_node3 22286 1726882821.01827: ^ task is: None 22286 1726882821.01828: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22286 1726882821.01830: done queuing things up, now waiting for results queue to drain 22286 1726882821.01830: results queue empty 22286 1726882821.01831: checking for any_errors_fatal 22286 1726882821.01832: done checking for any_errors_fatal 22286 1726882821.01832: checking for max_fail_percentage 22286 1726882821.01833: done checking for max_fail_percentage 22286 1726882821.01835: checking to see if all hosts have failed and the running result is not ok 22286 1726882821.01836: done checking to see if all hosts have failed 22286 1726882821.01837: getting the next task for host managed_node3 22286 1726882821.01839: done getting next task for host managed_node3 22286 1726882821.01840: ^ task is: None 22286 1726882821.01841: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node3 : ok=75 changed=2 unreachable=0 failed=0 skipped=63 rescued=0 ignored=0 Friday 20 September 2024 21:40:21 -0400 (0:00:00.193) 0:00:44.412 ****** =============================================================================== fedora.linux_system_roles.network : Configure networking connection profiles --- 2.85s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which services are running ---- 2.55s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.29s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Install iproute --------------------------------------------------------- 2.20s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Install iproute --------------------------------------------------------- 1.93s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Gathering Facts --------------------------------------------------------- 1.92s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6 Ensure ping6 command is present ----------------------------------------- 1.90s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 fedora.linux_system_roles.network : Check which packages are installed --- 1.83s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.28s 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Create veth interface veth0 --------------------------------------------- 1.24s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.02s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which packages are installed --- 1.02s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.01s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.98s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Gather the minimum subset of ansible_facts required by the network role test --- 0.87s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Check if system is ostree ----------------------------------------------- 0.83s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Gather current interface info ------------------------------------------- 0.83s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.74s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Get NM profile info ----------------------------------------------------- 0.67s 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Check routes and DNS ---------------------------------------------------- 0.58s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 22286 1726882821.01940: RUNNING CLEANUP